ALBERT

All Library Books, journals and Electronic Records Telegrafenberg


Filter
Collection
  • Articles  (240)
Publisher
  • Molecular Diversity Preservation International  (125)
  • MDPI  (115)
  • Institute of Electrical and Electronics Engineers (IEEE)
Years
  • 2020-2022
  • 2015-2019  (240)
  • 2010-2014
  • 1990-1994
  • 1945-1949
Year
  • 2019  (240)
  • 2010
Topic
  • Economics  (240)
  • Computer Science
  • 1
    Publication Date: 2019
    Description: In this paper, we measure systemic risk with a novel methodology based on a “spatial-temporal” approach. We propose a new bank systemic risk measure that considers the two components of systemic risk: the cross-sectional and the time dimension. The aim is to highlight the “time-space dynamics” of contagion, i.e., whether the CDS spread of bank i depends on the CDS spreads of other banks. To do this, we use an advanced spatial econometrics design with a time-varying spatial dependence that can be interpreted as an index of the degree of cross-sectional spillovers. The findings highlight that Eurozone banks show strong spatial dependence in the evolution of their CDS spreads; that is, the contagion effect is present and persistent. Moreover, we analyse the role of the European Central Bank in managing contagion risk. We find that monetary policy has been effective in reducing systemic risk. However, the results also show that elevated systemic risk does not in itself prompt a policy intervention, suggesting that financial stability is not yet an explicit policy objective.
    Electronic ISSN: 2227-9091
    Topics: Economics
    Published by MDPI
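The spatial dependence described in this abstract can be sketched as a spatial autoregressive (SAR) system y = rho*W*y + x, where y collects the banks' CDS spreads, W is a weight matrix and rho measures the degree of cross-sectional spillover. The three-bank weight matrix and spread components below are invented for illustration; this is not the paper's data or estimator.

```python
# Minimal SAR sketch: each bank's CDS spread y_i depends on a weighted
# average of the other banks' spreads (spatial lag) plus an idiosyncratic
# component x_i. All numbers are illustrative.

def spatial_lag(W, y):
    """Weighted average of the other banks' spreads, one entry per bank."""
    return [sum(w_ij * y_j for w_ij, y_j in zip(row, y)) for row in W]

def solve_sar(W, x, rho, iters=200):
    """Solve y = rho*W*y + x by fixed-point iteration; converges for
    |rho| < 1 when W is row-stochastic."""
    y = list(x)
    for _ in range(iters):
        y = [rho * lag + x_i for lag, x_i in zip(spatial_lag(W, y), x)]
    return y

# Three banks; each bank's weights over the other two sum to one.
W = [[0.0, 0.5, 0.5],
     [0.5, 0.0, 0.5],
     [0.5, 0.5, 0.0]]
x = [100.0, 120.0, 80.0]    # idiosyncratic spread components (bps)

calm = solve_sar(W, x, rho=0.1)    # weak spillovers
crisis = solve_sar(W, x, rho=0.8)  # strong spillovers amplify every spread
```

Raising rho amplifies every bank's equilibrium spread, which is the contagion channel the abstract's time-varying spatial dependence is meant to index.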
  • 2
    Publication Date: 2019
    Description: The SME sector is a major force in national economic and social development. Financial risk is one of the key threats to the activity of small and medium enterprises. The most common manifestation of the financial risk of SMEs is difficulty in financing the business and a lack of funds for development. Banks are unwilling to grant loans to such companies. Moreover, rising operating costs shrink profits, which may result in corporate debt, difficulty in debt repayment and, consequently, high financial risk for these entities. Numerous differences between the activities of small and large enterprises intensify this risk and mean that the model of credit financing for companies is not adjusted to the capabilities and operating principles of small enterprises. Risk management is therefore one of the most important internal processes in small and medium enterprises, and identifying the factors that affect the level of financial risk in these entities is crucial. The main objective of this research was to analyze the impact of selected parametric characteristics of the SME sector on the intensity of the financial risk they take. This objective was accomplished on the basis of a survey of Polish SMEs. To test the adopted research assumptions, a linear regression model was used with four continuous variables for each type of identified financial risk. Based on the final research results, a logit model was obtained for the risk of insufficient profits. It indicated that the internationalization of the company and the ability to manage risk are the only factors that affect a high level of risk of low income. The article ends with a discussion and a comparison with previous research in this area.
    Electronic ISSN: 2227-9091
    Topics: Economics
    Published by MDPI
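The logit model mentioned in this abstract can be illustrated with a minimal logistic regression fitted by stochastic gradient descent. The two 0/1 covariates (internationalisation, risk-management ability) and the toy response pattern below are invented, not the paper's survey data or estimates.

```python
import math

# Tiny logistic regression for a binary "high risk of insufficient profits"
# indicator, fitted by per-sample gradient descent on synthetic data.

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def fit_logit(X, y, lr=0.5, epochs=2000):
    w = [0.0] * (len(X[0]) + 1)           # intercept + one slope per covariate
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            p = sigmoid(w[0] + sum(wj * xj for wj, xj in zip(w[1:], xi)))
            err = p - yi                   # gradient of the log-loss
            w[0] -= lr * err
            w[1:] = [wj - lr * err * xj for wj, xj in zip(w[1:], xi)]
    return w

# Columns: (internationalised, manages risk); toy pattern, purely illustrative.
X = [(0, 0), (0, 1), (1, 0), (1, 1), (0, 0), (1, 1), (1, 0), (0, 1)]
y = [0, 0, 1, 1, 0, 1, 1, 0]

w = fit_logit(X, y)
p_dom = sigmoid(w[0])          # fitted risk probability, domestic firm
p_int = sigmoid(w[0] + w[1])   # fitted risk probability, internationalised firm
```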
  • 3
    Publication Date: 2019
    Description: We consider a two-dimensional ruin problem where the surplus process of business lines is modelled by a two-dimensional correlated Brownian motion with drift. We study the ruin function P(u) for the component-wise ruin (that is, both business lines are ruined in an infinite-time horizon), where u is the same initial capital for each line. We measure the goodness of the business by analysing the adjustment coefficient, that is, the limit of −ln P(u)/u as u tends to infinity, which depends essentially on the correlation ρ of the two surplus processes. In order to work out the adjustment coefficient, we solve a two-layer optimization problem.
    Electronic ISSN: 2227-9091
    Topics: Economics
    Published by MDPI
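The component-wise ruin probability P(u) in this abstract can be approximated by brute-force Monte Carlo. The sketch below truncates the infinite horizon at a finite T, discretises the correlated Brownian surpluses, and uses illustrative drifts, volatility and correlation (none taken from the paper); the last line is only a crude finite-sample proxy for the adjustment coefficient −ln P(u)/u.

```python
import math, random

# Monte Carlo sketch: two surplus lines X_i(t) = u + mu_i*t + sigma*B_i(t)
# with correlated increments; "ruin" means both lines drop below zero at
# some (grid) time before the truncated horizon T. The grid check slightly
# understates ruin; this is an illustration, not the paper's method.

def ruin_prob(u, mu=(0.5, 0.5), sigma=1.0, rho=0.5, T=40.0, dt=0.1,
              n_paths=1500, seed=1):
    rng = random.Random(seed)
    sq = math.sqrt(dt)
    c = math.sqrt(1.0 - rho * rho)
    ruined = 0
    for _ in range(n_paths):
        x1 = x2 = u
        hit1 = hit2 = False
        for _ in range(int(T / dt)):
            z1 = rng.gauss(0.0, 1.0)
            z2 = rho * z1 + c * rng.gauss(0.0, 1.0)  # correlated increment
            x1 += mu[0] * dt + sigma * sq * z1
            x2 += mu[1] * dt + sigma * sq * z2
            hit1 = hit1 or x1 < 0.0
            hit2 = hit2 or x2 < 0.0
            if hit1 and hit2:
                break
        ruined += hit1 and hit2
    return ruined / n_paths

p_small = ruin_prob(1.0)   # low initial capital: joint ruin more likely
p_large = ruin_prob(3.0)
adj = -math.log(p_large) / 3.0   # rough proxy for the adjustment coefficient
```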
  • 4
    Publication Date: 2019
    Description: Since the 2008–2009 financial crisis, banks have introduced a family of X-valuation adjustments (XVAs) to quantify the cost of counterparty risk and of its capital and funding implications. XVAs represent a paradigm switch in derivative management, from hedging to balance sheet optimization. They reflect market inefficiencies that should be compressed as much as possible. In this work, we present a genetic algorithm applied to the compression of credit valuation adjustment (CVA), the expected cost of client defaults to a bank. The design of the algorithm is fine-tuned to the hybrid structure, with both discrete and continuous parameters, of the corresponding high-dimensional and nonconvex optimization problem. To make intensive, trade-incremental XVA computations practical in real time, as required for XVA compression purposes, we propose an approach that circumvents portfolio revaluation at the cost of disk memory, storing the overnight portfolio exposure so that the exposure of the portfolio augmented by a new deal can be obtained at the cost of computing the exposure of the new deal only. This is illustrated by a CVA compression case study on real swap portfolios.
    Electronic ISSN: 2227-9091
    Topics: Economics
    Published by MDPI
  • 5
    Publication Date: 2019
    Description: This paper revisits the spectrally negative Lévy risk process embedded with the general tax structure introduced in Kyprianou and Zhou (2009). A joint Laplace transform is found concerning the first down-crossing time below level 0. The potential density is also obtained for the taxed Lévy risk process killed upon leaving [0, b]. The results are expressed using scale functions.
    Electronic ISSN: 2227-9091
    Topics: Economics
    Published by MDPI
  • 6
    Publication Date: 2019
    Description: We obtain closed-form expressions for the value of the joint Laplace transform of the running maximum and minimum of a diffusion-type process stopped at the first time at which the associated drawdown or drawup process hits a constant level before an independent exponential random time. It is assumed that the coefficients of the diffusion-type process are regular functions of the current values of its running maximum and minimum. The proof is based on the solution to the equivalent inhomogeneous ordinary differential boundary-value problem and the application of the normal-reflection conditions for the value function at the edges of the state space of the resulting three-dimensional Markov process. The result is related to the computation of probability characteristics of the take-profit and stop-loss values of a market trader during a given time period.
    Electronic ISSN: 2227-9091
    Topics: Economics
    Published by MDPI
  • 7
    Publication Date: 2019
    Description: The Growth-Optimal Portfolio (GOP) theory determines the path of bet sizes that maximize long-term wealth. This multi-horizon goal makes it more appealing among practitioners than myopic approaches, like Markowitz’s mean-variance or risk parity. The GOP literature typically considers risk-neutral investors with an infinite investment horizon. In this paper, we compute the optimal bet sizes in the more realistic setting of risk-averse investors with finite investment horizons. We find that, under this more realistic setting, the optimal bet sizes are considerably smaller than previously suggested by the GOP literature. We also develop quantitative methods for determining the risk-adjusted growth allocations (or risk budgeting) for a given finite investment horizon.
    Electronic ISSN: 2227-9091
    Topics: Economics
    Published by MDPI
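The gap between the growth-optimal and the risk-averse bet size discussed in this abstract can be illustrated with the classic one-asset formulas: the GOP (Kelly) fraction f* = (mu - r)/sigma^2 and its CRRA-scaled version f*/gamma. This is the textbook Kelly/Merton result with illustrative parameters, not the paper's finite-horizon computation.

```python
# One risky asset with drift mu, volatility sigma, risk-free rate r.
# The growth-optimal fraction maximises the long-run log-growth rate;
# a CRRA investor with risk aversion gamma holds the "fractional Kelly"
# allocation f*/gamma, which is smaller for gamma > 1.

def growth_optimal_fraction(mu, r, sigma):
    return (mu - r) / sigma ** 2

def fractional_kelly(mu, r, sigma, gamma):
    """Merton allocation for CRRA utility: the GOP scaled down by gamma."""
    return growth_optimal_fraction(mu, r, sigma) / gamma

def expected_log_growth(f, mu, r, sigma):
    """Long-run growth rate of wealth for a constant risky fraction f."""
    return r + f * (mu - r) - 0.5 * (f * sigma) ** 2

mu, r, sigma = 0.08, 0.02, 0.20          # illustrative market parameters
f_gop = growth_optimal_fraction(mu, r, sigma)        # 1.5 (leveraged!)
f_averse = fractional_kelly(mu, r, sigma, gamma=3)   # 0.5
```

The growth-optimal fraction maximises `expected_log_growth`, yet a risk-averse investor deliberately accepts a lower growth rate in exchange for smaller drawdowns, which is the direction of the paper's finding.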
  • 8
    Publication Date: 2019
    Description: In small populations, mortality rates are characterized by great volatility; datasets are often available only for a few years and suffer from missing data. Therefore, standard mortality models may produce highly uncertain and biologically improbable projections. In this paper, we deal with the mortality projections of the Maltese population, a small country with fewer than 500,000 inhabitants, whose data on exposures and observed deaths suffer from all the typical problems of small populations. We concentrate our analysis on older adult mortality. Starting from some recent suggestions in the literature, we assume that the mortality of a small population can be modeled starting from the mortality of a bigger one (the reference population) by adding a spread. The first part of the paper is dedicated to the choice of the reference population; we then test alternative mortality models. Finally, we verify the capacity of the proposed approach to reduce the volatility of the mortality projections. The results show that the model is able to significantly reduce the uncertainty of projected mortality rates and to ensure their coherent and biologically reasonable evolution.
    Electronic ISSN: 2227-9091
    Topics: Economics
    Published by MDPI
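The "reference population plus spread" idea in this abstract can be sketched in a few lines: estimate an age-specific spread between the small population's and the reference population's log mortality over the common years, then project the small population from the reference projection. All rates below are invented for illustration and do not come from the Maltese data.

```python
import math

# Model the small population's log mortality as the reference population's
# log mortality plus a spread estimated from the overlapping years.
# One age group, three common years; rates are illustrative.

m_ref   = {2016: 0.0100, 2017: 0.0098, 2018: 0.0096}  # reference population
m_small = {2016: 0.0130, 2017: 0.0125, 2018: 0.0127}  # small, noisy population

# Average log-spread over the common years.
spread = sum(math.log(m_small[y] / m_ref[y]) for y in m_ref) / len(m_ref)

# Project the small population from a (given) reference projection for 2019.
m_ref_2019 = 0.0094
m_small_2019 = m_ref_2019 * math.exp(spread)
```

Because the spread is averaged over several years, the projected small-population rate inherits the smooth trend of the reference population instead of its own year-to-year noise, which is how the approach reduces projection volatility.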
  • 9
    Publication Date: 2019
    Description: This paper assesses the hedge effectiveness of an index-based longevity swap and a longevity cap for a life annuity portfolio. Although longevity swaps are a natural instrument for hedging longevity risk, derivatives with non-linear pay-offs, such as longevity caps, provide more effective downside protection. A tractable stochastic mortality model with age dependent drift and volatility is developed and analytical formulae for prices of longevity derivatives are derived. The model is calibrated using Australian mortality data. The hedging of the life annuity portfolio is comprehensively assessed for a range of assumptions for the longevity risk premium, the term to maturity of the hedging instruments, as well as the size of the underlying annuity portfolio. The results compare the risk management benefits and costs of longevity derivatives with linear and nonlinear payoff structures.
    Electronic ISSN: 2227-9091
    Topics: Economics
    Published by MDPI
  • 10
    Publication Date: 2019
    Description: In this paper, we apply machine learning to forecast the conditional variance of long-term stock returns measured in excess of different benchmarks, considering the short- and long-term interest rate, the earnings-by-price ratio, and the inflation rate. In particular, we apply, in a two-step procedure, a fully nonparametric local-linear smoother and choose the set of covariates as well as the smoothing parameters via cross-validation. We find that volatility forecastability is much less important at longer horizons, regardless of the chosen model, and that the homoscedastic historical average of the squared return prediction errors gives an adequate approximation of the unobserved realised conditional variance for both the one-year and five-year horizons.
    Electronic ISSN: 2227-9091
    Topics: Economics
    Published by MDPI
  • 11
    Publication Date: 2019
    Description: The purpose of this study is to examine the volatility-timing performance of Singapore-based funds under the Central Provident Fund (CPF) Investment Scheme and non-CPF linked funds, taking into account the currency risk effect on internationally managed funds. In particular, we empirically assess whether funds under the CPF Investment Scheme outperform non-CPF funds by examining the volatility-timing performance associated with these funds. The volatility-timing ability of CPF funds would provide the CPF board with a new method for risk classification. We employ GARCH models and modified factor models to capture the response of funds to abnormal market conditional volatility, including the weekday effect. The SMB and HML factors for non-US based funds are constructed from stock market data to exclude the contribution of the size effect and the BE/ME effect. The results show that volatility timing is one of the factors contributing to the excess return of funds. However, funds' volatility timing seems to be country-specific. Most of the Japanese equity funds and global equity funds under the CPF Investment Scheme are found to have volatility-timing ability. This finding contrasts with existing studies on Asian ex-Japan funds and Greater China funds. Moreover, there is no evidence that funds under the CPF Investment Scheme show better group performance in volatility timing.
    Electronic ISSN: 2227-9091
    Topics: Economics
    Published by MDPI
  • 12
    Publication Date: 2019
    Description: The study of connectedness is key to assessing spillover effects and identifying lead-lag relationships among market exchanges trading the same asset. By means of an extension of the Diebold and Yilmaz (2012) econometric connectedness measures, we examined the relationships of five major Bitcoin exchange platforms during two periods of main interest: the 2017 surge in prices and the 2018 decline. We concluded that Bitfinex and Gemini are leading exchanges in terms of return spillover transmission during the analyzed time-frame, while Bittrex acts as a follower. We also found that connectedness of overall returns fell substantially right before the Bitcoin price hype, whereas it leveled out during the down-market period. We confirmed that the results are robust with regard to the modeling strategies.
    Electronic ISSN: 2227-9091
    Topics: Economics
    Published by MDPI
  • 13
    Publication Date: 2019
    Description: In this paper, we address the problem of modeling mid-price movements arising in high-frequency and algorithmic trading using real data. Namely, we introduce different new types of General Compound Hawkes Processes (GCHPDO, GCHP2SDO, GCHPnSDO) and find their diffusive limits to model the mid-price movements of six stocks: EBAY, FB, MU, PCAR, SMH, and CSCO. We also define error rates to estimate the models' fitting accuracy. Maximum Likelihood Estimation (MLE) and Particle Swarm Optimization (PSO) are used to calibrate the Hawkes processes and model parameters.
    Electronic ISSN: 2227-9091
    Topics: Economics
    Published by MDPI
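A Hawkes process of the kind underlying the compound models in this abstract can be simulated with Ogata's thinning algorithm. The sketch below handles the univariate exponential-kernel case with illustrative parameters; the paper's GCHP variants build compound processes on top of point processes like this one, and none of the numbers below come from the paper.

```python
import math, random

# Ogata thinning for a univariate Hawkes process with intensity
#   lambda(t) = mu + sum_{t_i < t} alpha * exp(-beta * (t - t_i)).
# Each event excites the intensity by alpha, which then decays at rate beta.

def intensity(t, events, mu, alpha, beta):
    return mu + sum(alpha * math.exp(-beta * (t - ti)) for ti in events if ti < t)

def simulate_hawkes(mu, alpha, beta, horizon, seed=7):
    rng = random.Random(seed)
    events, t = [], 0.0
    while t < horizon:
        # Conservative upper bound on the (decaying) intensity after time t.
        lam_bar = intensity(t, events, mu, alpha, beta) + alpha
        t += rng.expovariate(lam_bar)          # candidate waiting time
        if t >= horizon:
            break
        if rng.random() * lam_bar <= intensity(t, events, mu, alpha, beta):
            events.append(t)                   # accept: intensity jumps by alpha
    return events

# Stationary regime requires alpha/beta < 1 (here 0.8/1.5 ~ 0.53).
events = simulate_hawkes(mu=0.5, alpha=0.8, beta=1.5, horizon=100.0)
```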
  • 14
    Publication Date: 2019
    Description: This paper studies the optimal investment and consumption strategies in a two-asset model. A dynamic Value-at-Risk constraint is imposed to manage the wealth process. By using Value at Risk as the risk measure during the investment horizon, the decision maker can dynamically monitor the exposed risk and quantify the maximum expected loss over a finite horizon period at a given confidence level. In addition, the decision maker has to filter the key economic factors to make decisions. Considering the cost of filtering the factors, the decision maker aims to maximize the utility of consumption in a finite horizon. By using the Kalman filter, a partially observed system is converted to a completely observed one. However, due to the cost of information processing, the decision maker fails to process the information in an arbitrarily rational manner and can only make decisions on the basis of the limited observed signals. A genetic algorithm was developed to find the optimal investment, consumption strategies, and observation strength. Numerical simulation results are provided to illustrate the performance of the algorithm.
    Electronic ISSN: 2227-9091
    Topics: Economics
    Published by MDPI
  • 15
    Publication Date: 2019
    Description: Value at Risk (VaR) illustrates the maximum potential loss under a given confidence level, and is just a single indicator for evaluating risk, ignoring any information about income. The present paper generalizes one-dimensional VaR to two-dimensional VaR with income-risk double indicators. We first construct a double VaR with (μ, σ²) (or (μ, VaR²)) indicators, and deduce the joint confidence region of (μ, σ²) (or (μ, VaR²)) by virtue of the two-dimensional likelihood ratio method. Finally, an example covering the empirical analysis of two double-VaR models is presented.
    Electronic ISSN: 2227-9091
    Topics: Economics
    Published by MDPI
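The one-dimensional VaR that this abstract generalises has a simple closed form under a normal profit-and-loss assumption: VaR at confidence level c is -(mu - z_c * sigma), with z_c the standard normal quantile. A minimal sketch with illustrative inputs (the paper's two-dimensional construction is not reproduced here):

```python
from statistics import NormalDist

# Parametric (normal) VaR: the loss threshold exceeded with probability 1 - c.
# The paper pairs a risk number like this with the income indicator mu.

def var_normal(mu, sigma, level=0.99):
    z = NormalDist().inv_cdf(level)     # standard normal quantile
    return -(mu - z * sigma)

mu, sigma = 0.05, 0.20                  # illustrative annual mean and volatility
var99 = var_normal(mu, sigma, 0.99)
var95 = var_normal(mu, sigma, 0.95)
```

By construction, a return below -var99 occurs with probability exactly 1 percent under the assumed normal distribution, which the test below confirms by round-tripping through the CDF.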
  • 16
    Publication Date: 2019
    Description: This paper explores the stochastic collocation technique, applied on a monotonic spline, as an arbitrage-free and model-free interpolation of implied volatilities. We explore various spline formulations, including B-spline representations. We explain how to calibrate the different representations against market option prices, detail how to smooth out the market quotes, and choose a proper initial guess. The technique is then applied to concrete market options and the stability of the different approaches is analyzed. Finally, we consider a challenging example where convex spline interpolations lead to oscillations in the implied volatility and compare the spline collocation results with those obtained through the arbitrage-free interpolation technique of Andreasen and Huge.
    Electronic ISSN: 2227-9091
    Topics: Economics
    Published by MDPI
  • 17
    Publication Date: 2019
    Description: We provide ready-to-use formulas for European options prices, risk sensitivities, and P&L calculations under Lévy-stable models with maximal negative asymmetry. Particular cases, efficiency testing, and some qualitative features of the model are also discussed.
    Electronic ISSN: 2227-9091
    Topics: Economics
    Published by MDPI
  • 18
    Publication Date: 2019
    Description: To reduce the negative impacts of risks in farming due to climate change, the government implemented agricultural production cost insurance in 2015. Although a huge amount of subsidy has been allocated by the government (80 percent of the premium), farmers’ participation rate is still low (23 percent of the target in 2016). In order to solve the issue, it is indispensable to identify farmers’ willingness to pay (WTP) for and determinants of their participation in agricultural production cost insurance. Based on a field survey of 240 smallholder farmers in the Garut District, West Java Province in August–October 2017 and February 2018, the contingent valuation method (CVM) estimated that farmers’ mean willingness to pay (WTP) was Rp 30,358/ha/cropping season ($2.25/ha/cropping season), which was 16 percent lower than the current premium (Rp 36,000/ha/cropping season = $2.67/ha/cropping season). Farmers who participated in agricultural production cost insurance shared some characteristics: operating larger farmland, more contact with agricultural extension service, lower expected production for the next cropping season, and a downstream area location.
    Electronic ISSN: 2227-9091
    Topics: Economics
    Published by MDPI
  • 19
    Publication Date: 2019
    Description: It is known that the classical ruin function under an exponential claim-size distribution depends on two parameters, referred to as the mean claim size and the relative security loading. These parameters are assumed to be unknown and random; thus, a loss function can be considered that measures the loss sustained by a decision-maker who takes as valid a ruin function that is not correct. By using a squared-error loss function and an appropriate distribution for these parameters, the problem of estimating the ruin function leads to a mixture procedure. First, a bivariate distribution for mixing the two parameters jointly is considered; second, different univariate distributions for mixing the parameters separately are examined. Consequently, a catalogue of ruin probability functions and severities of ruin, more flexible than the original ones, is obtained. The methodology is also extended to the Pareto claim-size distribution. Several numerical examples illustrate the performance of these functions.
    Electronic ISSN: 2227-9091
    Topics: Economics
    Published by MDPI
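The classical ruin function for exponential claims has the closed form psi(u) = exp(-theta*u / (mu*(1+theta))) / (1+theta), with mu the mean claim size and theta the relative security loading, and the mixing step described in the abstract can be sketched directly on top of it. The discrete prior below is a stand-in for the paper's mixing distributions and is purely illustrative.

```python
import math

# Cramér–Lundberg ruin probability with exponential claim sizes, and a
# mixture of it over a discrete prior on the unknown (mu, theta).

def psi(u, mu, theta):
    """Classical ruin probability for initial capital u."""
    return math.exp(-theta * u / (mu * (1 + theta))) / (1 + theta)

def psi_mixed(u, prior):
    """prior: list of (weight, mu, theta) with weights summing to one."""
    return sum(w * psi(u, mu, theta) for w, mu, theta in prior)

# Illustrative three-point prior on (mu, theta).
prior = [(0.5, 1.0, 0.1), (0.3, 1.5, 0.2), (0.2, 2.0, 0.3)]

ruin_plain = psi(10.0, mu=1.0, theta=0.1)   # single known parameter pair
ruin_mix = psi_mixed(10.0, prior)           # averaged over the prior
```

Note the familiar boundary value psi(0) = 1/(1+theta): with zero initial capital, ruin is not certain as long as the loading theta is positive.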
  • 20
    Publication Date: 2019
    Description: XGBoost is recognized as an algorithm with exceptional predictive capacity. Models for a binary response indicating the existence of accident claims versus no claims can be used to identify the determinants of traffic accidents. This study compared the relative performances of logistic regression and XGBoost approaches for predicting the existence of accident claims using telematics data. The dataset contained information from an insurance company about the individuals' driving patterns, including total annual distance driven and percentage of total distance driven in urban areas. Our findings showed that logistic regression is a suitable model given its interpretability and good predictive capacity. XGBoost requires numerous model-tuning procedures to match the predictive performance of the logistic regression model, and demands greater effort in interpretation.
    Electronic ISSN: 2227-9091
    Topics: Economics
    Published by MDPI
  • 21
    Publication Date: 2019
    Description: In this paper, we develop a framework for measuring, allocating and managing systemic risk. SystRisk, our measure of total systemic risk, captures the a priori cost to society for providing tail-risk insurance to the financial system. Our allocation principle distributes the total systemic risk among individual institutions according to their size-shifted marginal contributions. To describe economic shocks and systemic feedback effects, we propose a reduced form stochastic model that can be calibrated to historical data. We also discuss systemic risk limits, systemic risk charges and a cap and trade system for systemic risk.
    Electronic ISSN: 2227-9091
    Topics: Economics
    Published by MDPI
  • 22
    Publication Date: 2019
    Description: While the main conceptual issue related to deposit insurance is moral hazard risk, the main technical issue is inaccurate calibration of the implied volatility, which can raise the risk of generating an arbitrage. In this paper, we first argue that, once the no-moral-hazard condition is imposed, removing arbitrage is equivalent to removing static arbitrage. We then propose a simple quadratic model to parameterize the implied volatility and remove static arbitrage. The process is as follows: using a machine learning approach with a regularized cost function, we update the parameters so that butterfly arbitrage is ruled out; and by implementing a calibration method, we impose conditions on the parameters of each time slice to rule out calendar spread arbitrage. Eliminating both butterfly and calendar spread arbitrage therefore makes the implied volatility surface free of static arbitrage.
    Electronic ISSN: 2227-9091
    Topics: Economics
    Published by MDPI
  • 23
    Publication Date: 2019
    Description: Banks in India have gone through structural changes in the last three decades. The prices that banks charge depend on the competitive level of the banking sector and on the risk that assets and liabilities carry on banks' balance sheets. The traditional Lerner index indicates competitive levels. However, this measure does not account for risk, and this study introduces a risk-adjusted Lerner index for evaluating competition in Indian banking for the period 1996 to 2016. The market power estimated through the adjusted Lerner index has been declining since 1996, which indicates an improvement in competitive conditions over the period. Further, as indicated by the risk-adjusted Lerner index, the Indian banking system exerts much less market power, and hence is more competitive, than the traditional Lerner index suggests.
    Electronic ISSN: 2227-9091
    Topics: Economics
    Published by MDPI
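The Lerner index itself is a one-line formula, L = (P - MC) / P. The sketch below adds a simplified per-unit risk deduction to the margin as a stand-in for a risk adjustment; the exact risk-adjusted construction used in the paper for Indian banks is not reproduced here, and all figures are illustrative.

```python
# Market power as the relative markup of price over marginal cost.

def lerner(price, marginal_cost):
    """Traditional Lerner index: 0 = perfect competition, ->1 = monopoly."""
    return (price - marginal_cost) / price

def lerner_risk_adjusted(price, marginal_cost, risk_cost):
    """Simplified variant: deduct a per-unit cost of risk from the margin."""
    return (price - marginal_cost - risk_cost) / price

p, mc, risk = 100.0, 70.0, 12.0       # illustrative price, cost, risk cost
l_plain = lerner(p, mc)               # 0.30
l_adj = lerner_risk_adjusted(p, mc, risk)  # 0.18
```

Part of the apparent markup compensates risk-bearing rather than market power, so the risk-adjusted index is lower, which is the direction of the paper's finding for Indian banks.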
  • 24
    Publication Date: 2019
    Description: In the field of mortality, the Lee–Carter based approach can be considered the milestone among stochastic mortality forecasting models. We could define a “Lee–Carter model family” that embraces all developments of this model, including its first formulation (1992), which remains the benchmark for comparing the performance of newer models. In the Lee–Carter model, the κt parameter, describing the mortality trend over time, plays an important role in future mortality behavior. The traditional ARIMA process usually used to model κt shows evident limitations in describing the future mortality shape. For the forecasting phase, a more plausible approach is needed to capture a nonlinear shape of the projected mortality rates. We therefore propose an alternative to the ARIMA processes based on a deep learning technique. More precisely, in order to catch the pattern of the κt series over time more accurately, we apply a Recurrent Neural Network with a Long Short-Term Memory architecture and integrate it with the Lee–Carter model to improve its predictive capacity. The proposed approach provides significant performance in terms of predictive accuracy and also allows one to avoid the a priori selection of time chunks. Indeed, it is common practice among academics to delete periods in which the noise is overwhelming or the data quality is insufficient. The strength of the Long Short-Term Memory network lies in its ability to treat this noise and adequately reproduce it in the forecasted trend, thanks to an architecture able to take significant long-term patterns into account.
    Electronic ISSN: 2227-9091
    Topics: Economics
    Published by MDPI
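The ARIMA benchmark that the LSTM in this abstract is compared against is, in the classical Lee–Carter setting, a random walk with drift for the period index κt: the forecast is a linear extrapolation from the estimated drift, which is exactly the rigidity the paper criticises. A sketch on a synthetic κt series (values invented):

```python
# Random-walk-with-drift (ARIMA(0,1,0)) forecast of the Lee-Carter period
# index kappa_t. The drift estimate from first differences telescopes to
# (last - first) / (n - 1), and the h-step forecast is last + h * drift.

kappa = [10.0, 9.2, 8.6, 7.9, 7.1, 6.5, 5.8, 5.2, 4.4, 3.9]  # synthetic

def rwd_forecast(series, horizon):
    drift = (series[-1] - series[0]) / (len(series) - 1)
    return [series[-1] + h * drift for h in range(1, horizon + 1)]

forecast = rwd_forecast(kappa, horizon=5)
```

Whatever curvature the historical κt series shows, this benchmark projects a straight line, which is the limitation that motivates replacing it with a recurrent network.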
  • 25
    Publication Date: 2019
    Description: With the expected discontinuation of LIBOR publication, a robust fallback for related financial instruments is paramount. In recent months, several consultations have taken place on the subject. The results of the first ISDA consultation were published in November 2018, and a new one had just finished at the time of writing. This note describes issues associated with the proposed approaches, and potential alternatives, in the framework and context of quantitative finance. It evidences a clear lack of detail and of measurability in the proposed approaches, which would not be achievable in practice. It also describes the potential for asymmetric information between market participants arising from the adjustment spread computation. In the opinion of this author, a fundamental revision of the fallback's foundations is required.
    Electronic ISSN: 2227-9091
    Topics: Economics
    Published by MDPI
  • 26
    Publication Date: 2019
    Description: The purpose of this paper is to survey recent developments in granular models and machine learning models for loss reserving, and to compare the two families with a view to assessment of their potential for future development. This is best understood against the context of the evolution of these models from their predecessors, and the early sections recount relevant archaeological vignettes from the history of loss reserving. However, the larger part of the paper is concerned with the granular models and machine learning models. Their relative merits are discussed, as are the factors governing the choice between them and the older, more primitive models. Concluding sections briefly consider the possible further development of these models in the future.
    Electronic ISSN: 2227-9091
    Topics: Economics
    Published by MDPI
  • 27
    Publication Date: 2019
    Description: It is intuitive that proximity to hospitals can only improve the chances of survival from a range of medical conditions. This study examines the empirical evidence for this assertion, based on Australian data. While hospital proximity might serve as a proxy for other factors, such as indigeneity, income, wealth or geography, the evidence suggests that proximity provides the most direct link to these factors, and, as it turns out, a very statistically significant one that transcends economies.
    Electronic ISSN: 2227-9091
    Topics: Economics
    Published by MDPI
  • 28
    Publication Date: 2019
    Description: We analyzed real telematics information for a sample of drivers with usage-based insurance policies. We examined the statistical distribution of distance driven above the posted speed limit—which presents a strong positive asymmetry—using quantile regression models. We found that, at different percentile levels, the distance driven at speeds above the posted limit depends on total distance driven and, more generally, on factors such as the percentage of urban and nighttime driving and on the driver’s gender. However, the impact of these covariates differs according to the percentile level. We stress the importance of understanding telematics information, which should not be limited to simply characterizing average drivers, but can be useful for signaling dangerous driving by predicting quantiles associated with specific driver characteristics. We conclude that the risk of driving for long distances above the speed limit is heterogeneous and, moreover, we show that prevention campaigns should target primarily male non-urban drivers, especially if they present a high percentage of nighttime driving.
    Electronic ISSN: 2227-9091
    Topics: Economics
    Published by MDPI
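The quantile-regression technique at the heart of the abstract above can be illustrated with a minimal sketch: a linear conditional quantile is estimated by minimizing the pinball (check) loss directly. The data below are synthetic stand-ins for the telematics variables (total distance, distance driven above the limit), not the authors' sample.

```python
import numpy as np
from scipy.optimize import minimize

def pinball(u, tau):
    # Check (pinball) loss: minimised by the tau-th conditional quantile
    return np.where(u >= 0, tau * u, (tau - 1.0) * u)

def fit_quantile(x, y, tau):
    # Linear quantile regression: minimise the summed pinball loss,
    # warm-started at the ordinary least-squares fit
    X = np.column_stack([np.ones(len(x)), x])
    b0 = np.linalg.lstsq(X, y, rcond=None)[0]
    res = minimize(lambda b: pinball(y - X @ b, tau).sum(), b0,
                   method="Nelder-Mead")
    return res.x  # [intercept, slope]

# Synthetic, right-skewed data in the spirit of the speeding-distance variable
rng = np.random.default_rng(1)
total_km = rng.uniform(100.0, 1000.0, 400)
speeding_km = 0.05 * total_km + rng.gamma(2.0, total_km / 200.0)
b50 = fit_quantile(total_km, speeding_km, 0.50)
b90 = fit_quantile(total_km, speeding_km, 0.90)
```

With strongly asymmetric data like these, the fitted 90th-percentile line lies well above the median line, which is exactly why the paper argues that covariate effects differ across percentile levels.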
  • 29
    Publication Date: 2019
    Description: In this paper, we propose models for non-life loss reserving combining traditional approaches such as Mack’s or generalized linear models and gradient boosting algorithm in an individual framework. These claim-level models use information about each of the payments made for each of the claims in the portfolio, as well as characteristics of the insured. We provide an example based on a detailed dataset from a property and casualty insurance company. We contrast some traditional aggregate techniques, at the portfolio-level, with our individual-level approach and we discuss some points related to practical applications.
    Electronic ISSN: 2227-9091
    Topics: Economics
    Published by MDPI
  • 30
    Publication Date: 2019
    Description: The aim of this study was to investigate whether firms’ reporting delays are interconnected with bankruptcy risk and its financial determinants. This study was based on 698,189 firm-year observations from Estonia. Annual report submission delay, either in a binary or ordinal form, was used as the dependent variable, while bankruptcy risk based on an international model or the financial ratios determining it were the independent variables. The findings indicated that firms with lower values of liquidity and annual and accumulated profitability were more likely to delay the submission of an annual report beyond the legal deadline. In turn, firm leverage was not interconnected with reporting delays. In addition, firms with a higher risk of bankruptcy were more likely to delay the submission of their annual reports. The results varied across firms of different ages, sizes and industries. Stakeholders should be aware that reporting delays can signal higher bankruptcy risk or poor performance, and thus, for instance, extending credit to such firms should be treated with caution. State institutions monitoring timely submission should take strict(er) measures against firms that delay for a lengthy period.
    Electronic ISSN: 2227-9091
    Topics: Economics
    Published by MDPI
  • 31
    Publication Date: 2019
    Description: We propose an alternative approach to the modeling of the positive dependence between the probability of default and the loss given default in a portfolio of exposures, using a bivariate urn process. The model combines the power of Bayesian nonparametrics and statistical learning, allowing for the elicitation and the exploitation of experts’ judgements, and for the constant update of this information over time, every time new data are available. A real-world application on mortgages is described using the Single Family Loan-Level Dataset by Freddie Mac.
    Electronic ISSN: 2227-9091
    Topics: Economics
    Published by MDPI
  • 32
    Publication Date: 2019
    Description: In this paper, the generalized Pareto distribution (GPD) copula approach is utilized to solve the conditional value-at-risk (CVaR) portfolio problem. In particular, this approach uses (i) copulas to model the complete linear and non-linear dependence structure, (ii) semi-parametric marginals with a parametric Pareto lower tail, a non-parametric kernel-smoothed interior and a parametric Pareto upper tail, and (iii) value-at-risk (VaR) to quantify the risk measure. The simulated sample covers the G7, BRICS (association of Brazil, Russia, India, China and South Africa) and 14 popular emerging stock-market returns for the period between 1997 and 2018. Our results suggest that the efficient frontier obtained by minimizing CVaR over the simulated copula returns outperforms the risk/return of domestic portfolios, such as the US stock market. This result improves international diversification at the global level. We also show that the Gaussian and t-copula simulated returns give very similar but not identical results. Furthermore, the copula simulation provides more accurate market-risk estimates than historical simulation. Finally, the results support the notion that G7 countries can provide an important opportunity for diversification. These results are important to investors and policymakers.
    Electronic ISSN: 2227-9091
    Topics: Economics
    Published by MDPI
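The parametric Pareto tail in the semi-parametric marginals described above is typically fitted with the peaks-over-threshold method. A minimal sketch, assuming SciPy's `genpareto` and synthetic Pareto-distributed losses rather than the paper's stock-market returns:

```python
import numpy as np
from scipy.stats import genpareto

def fit_tail(losses, threshold_q=0.90):
    # Peaks-over-threshold: fit a GPD to exceedances over a high quantile
    u = np.quantile(losses, threshold_q)
    exceed = losses[losses > u] - u
    xi, _, beta = genpareto.fit(exceed, floc=0.0)
    return u, xi, beta

def tail_var(u, xi, beta, n, n_u, alpha):
    # POT estimator: VaR_alpha = u + (beta/xi) * (((n/n_u) * (1-alpha))**(-xi) - 1)
    return u + (beta / xi) * (((n / n_u) * (1.0 - alpha)) ** (-xi) - 1.0)

# Synthetic heavy-tailed losses (Pareto with tail index 3), illustrative only
rng = np.random.default_rng(5)
losses = rng.pareto(3.0, 5000)
u, xi, beta = fit_tail(losses)
n_u = int((losses > u).sum())
var99 = tail_var(u, xi, beta, len(losses), n_u, 0.99)
```

The interior of the marginal (below the threshold) would be handled non-parametrically, e.g. with a kernel-smoothed empirical distribution, before coupling the marginals through a copula.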
  • 33
    Publication Date: 2019
    Description: We consider de Finetti’s stochastic control problem when the (controlled) process is allowed to spend time under the critical level. More precisely, we consider a generalized version of this control problem in a spectrally negative Lévy model with exponential Parisian ruin. We show that, under mild assumptions on the Lévy measure, an optimal strategy is formed by a barrier strategy and that this optimal barrier level is always less than the optimal barrier level when classical ruin is implemented. In addition, we give necessary and sufficient conditions for the barrier strategy at level zero to be optimal.
    Electronic ISSN: 2227-9091
    Topics: Economics
    Published by MDPI
  • 34
    Publication Date: 2019
    Description: The hierarchical risk parity (HRP) approach to portfolio allocation, introduced by Lopez de Prado (2016), applies graph theory and machine learning to build a diversified portfolio. Like the traditional risk-based allocation methods, HRP is also a function of the estimate of the covariance matrix; however, it does not require its invertibility. In this paper, we first study the impact of covariance misspecification on the performance of the different allocation methods. Next, we study, under an appropriate covariance forecast model, whether the machine learning-based HRP outperforms the traditional risk-based portfolios. For our analysis, we use the test for superior predictive ability on out-of-sample portfolio performance to determine whether the observed excess performance is significant or occurred by chance. We find that when the covariance estimates are crude, inverse volatility weighted portfolios are more robust, followed by the machine learning-based portfolios. Minimum variance and maximum diversification are most sensitive to covariance misspecification. HRP occupies the middle ground: it is less sensitive to covariance misspecification than the minimum variance or maximum diversification portfolios, while not as robust as the inverse volatility weighted portfolio. We also study the impact of different rebalancing horizons and how the portfolios compare against a market-capitalization weighted portfolio.
    Electronic ISSN: 2227-9091
    Topics: Economics
    Published by MDPI
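The HRP algorithm referenced above proceeds in three steps: hierarchical clustering on a correlation distance, quasi-diagonalisation of the covariance matrix, and recursive bisection with inverse-variance risk splits. A simplified sketch of that recipe (not Lopez de Prado's reference implementation):

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, leaves_list
from scipy.spatial.distance import squareform

def hrp_weights(cov):
    # 1. correlation-distance matrix, d = sqrt((1 - rho) / 2)
    std = np.sqrt(np.diag(cov))
    corr = cov / np.outer(std, std)
    dist = np.sqrt(np.clip((1.0 - corr) / 2.0, 0.0, None))
    # 2. quasi-diagonalisation: order assets by the dendrogram leaves
    order = leaves_list(linkage(squareform(dist, checks=False), method="single"))

    def cluster_var(items):
        # inverse-variance-weighted variance of a cluster
        sub = cov[np.ix_(items, items)]
        ivp = 1.0 / np.diag(sub)
        ivp /= ivp.sum()
        return ivp @ sub @ ivp

    # 3. recursive bisection, splitting risk by inverse cluster variance
    w = np.ones(len(cov))
    stack = [list(order)]
    while stack:
        cl = stack.pop()
        if len(cl) < 2:
            continue
        left, right = cl[:len(cl) // 2], cl[len(cl) // 2:]
        a, b = cluster_var(left), cluster_var(right)
        alpha = 1.0 - a / (a + b)
        w[left] *= alpha
        w[right] *= 1.0 - alpha
        stack += [left, right]
    return w / w.sum()
```

Note that only the diagonal and the clustering order of the covariance estimate are used for the splits, which is one intuition for why HRP never needs the inverse of the full covariance matrix.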
  • 35
    Publication Date: 2019
    Description: One of the key components of counterparty credit risk (CCR) measurement is generating scenarios for the evolution of the underlying risk factors, such as interest and exchange rates, equity and commodity prices, and credit spreads. Geometric Brownian Motion (GBM) is a widely used method for modeling the evolution of exchange rates. An important limitation of GBM is that, due to the assumption of constant drift and volatility, stylized facts of financial time-series, such as volatility clustering and heavy-tailedness in the returns distribution, cannot be captured. We propose a model where volatility and drift are able to switch between regimes; more specifically, they are governed by an unobservable Markov chain. Hence, we model exchange rates with a hidden Markov model (HMM) and generate scenarios for counterparty exposure using this approach. A numerical study is carried out and backtesting results for a number of exchange rates are presented. The impact of using a regime-switching model on counterparty exposure is found to be profound for derivatives with non-linear payoffs.
    Electronic ISSN: 2227-9091
    Topics: Economics
    Published by MDPI
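The regime-switching scenario generator described above can be sketched as a GBM whose drift and volatility are selected by a two-state Markov chain; the parameter values below are illustrative, not the authors' calibrated estimates.

```python
import numpy as np

def simulate_regime_gbm(s0, mu, sigma, P, n_steps, dt, rng):
    # GBM whose drift/volatility are driven by an unobservable Markov chain
    s = np.empty(n_steps + 1)
    s[0], state = s0, 0
    for t in range(n_steps):
        state = rng.choice(2, p=P[state])          # regime transition
        z = rng.standard_normal()
        s[t + 1] = s[t] * np.exp((mu[state] - 0.5 * sigma[state] ** 2) * dt
                                 + sigma[state] * np.sqrt(dt) * z)
    return s

# Two illustrative regimes: calm (low vol) and turbulent (high vol)
P = np.array([[0.98, 0.02],
              [0.05, 0.95]])
rng = np.random.default_rng(3)
path = simulate_regime_gbm(1.10, [0.01, -0.02], [0.06, 0.25], P, 252, 1.0 / 252, rng)
```

Because the chain is persistent, simulated paths exhibit volatility clustering that a constant-parameter GBM cannot reproduce, which is what drives the exposure differences for non-linear payoffs.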
  • 36
    Publication Date: 2019
    Description: In the past two decades, increasing computational power has enabled the development of more advanced claims reserving techniques, allowing stochastic methods to overtake deterministic ones and produce forecasts of enhanced quality. Hence, not only point estimates but full predictive distributions can be generated to forecast future claim amounts. The significant expansion in the variety of models requires the validation of these methods and the creation of supporting techniques for appropriate decision making. The present article compares and validates several existing and self-developed stochastic methods on actual data, applying comparison measures in an algorithmic manner.
    Electronic ISSN: 2227-9091
    Topics: Economics
    Published by MDPI
  • 37
    Publication Date: 2019
    Description: This is Part III of a series of papers which focus on a general framework for portfolio theory. Here, we extend the general framework for portfolio theory in a one-period financial market introduced in Part I [Maier-Paape and Zhu, Risks 2018, 6(2), 53] to multi-period markets. This extension is reasonable for applications. More importantly, we take a new approach, the “modular portfolio theory”, which is built from the interaction among four related modules: (a) multi-period market model; (b) trading strategies; (c) risk and utility functions (performance criteria); and (d) the optimization problem (efficient frontier and efficient portfolio). An important concept that allows dealing with the more general framework discussed here is a trading strategy generating function. This concept limits the discussion to a special class of manageable trading strategies, which is still wide enough to cover many frequently used trading strategies, for instance “constant weight” (fixed fraction). As an application, we discuss the utility function of compounded return and the risk measure of relative log drawdowns.
    Electronic ISSN: 2227-9091
    Topics: Economics
    Published by MDPI
  • 38
    Publication Date: 2019
    Description: Stochastic mortality models have been developed for a range of applications, from demographic projections to financial management. Financial risk-based models build on methods used for interest rates and apply them to mortality rates. They have the advantage of being applicable to financial pricing and the management of longevity risk. Olivier and Jeffery (2004) and Smith (2005) proposed a model based on a forward-rate mortality framework with stochastic factors driven by univariate gamma random variables irrespective of age or duration. We assess and further develop this model. We generalize the random shocks from a univariate gamma to a univariate Tweedie distribution and allow the distributions to vary by age. Furthermore, since dependence between ages is an observed characteristic of mortality rate improvements, we formulate a multivariate framework using copulas. We find that dependence increases with age and introduce a suitable covariance structure, one that is related to the notion of a minimum. The resulting model provides a more realistic basis for capturing the risk of mortality improvements and serves to enhance longevity risk management for pension and insurance funds.
    Electronic ISSN: 2227-9091
    Topics: Economics
    Published by MDPI
  • 39
    Publication Date: 2019
    Description: In this research, trade credit is analysed from the seller (supplier) perspective. Trade credit allows the supplier to increase sales and profits but creates the risk that the customer will not pay, and at the same time increases the risk of the supplier’s insolvency. If the supplier is a small or micro-enterprise (SMiE), limited human and technical resources are usually an issue. Therefore, when dealing with these issues, the supplier needs a highly accurate but simple and highly interpretable trade credit risk assessment model that allows for assessing the risk of insolvency of buyers (who are usually also SMiE). The aim of the research is to create a statistical enterprise trade credit risk assessment (ETCRA) model for Lithuanian small and micro-enterprises (SMiE). In the empirical analysis, the financial and non-financial data of 734 small and micro-sized enterprises in the period of 2010–2012 were chosen as the sample. Based on logistic regression, the ETCRA model was developed using financial and non-financial variables. In the ETCRA model, the enterprise’s financial performance is assessed from different perspectives: profitability, liquidity, solvency, and activity. Model variants were created using (i) only financial ratios and (ii) financial ratios and non-financial variables. The inclusion of non-financial variables, however, does not substantially improve the characteristics of the model, so the variants that use only financial ratios can be preferred in practice, although those including non-financial variables remain usable as well. The designed models can be used by suppliers when deciding whether to grant trade credit to small or micro-enterprises.
    Electronic ISSN: 2227-9091
    Topics: Economics
    Published by MDPI
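The logistic-regression backbone of a credit-scoring model like ETCRA can be sketched with plain gradient descent; the three features below are synthetic placeholders for the financial ratios (profitability, liquidity, solvency, activity) used in the paper, not its data.

```python
import numpy as np

def fit_logistic(X, y, lr=0.5, steps=2000):
    # Maximum-likelihood logistic regression via plain gradient descent
    X1 = np.column_stack([np.ones(len(X)), X])
    w = np.zeros(X1.shape[1])
    for _ in range(steps):
        p = 1.0 / (1.0 + np.exp(-X1 @ w))
        w -= lr * X1.T @ (p - y) / len(y)
    return w

def score(w, X):
    # Predicted probability of buyer insolvency
    X1 = np.column_stack([np.ones(len(X)), X])
    return 1.0 / (1.0 + np.exp(-X1 @ w))

# Synthetic "financial ratios" and insolvency labels, illustrative only
rng = np.random.default_rng(4)
X = rng.normal(size=(500, 3))
y = (X @ np.array([1.5, -1.0, 0.5]) + rng.normal(0.0, 0.5, 500) > 0).astype(float)
w = fit_logistic(X, y)
```

The interpretability the paper emphasises comes directly from the coefficients: each entry of `w` is a log-odds effect of one ratio, which is why logistic scorecards suit resource-constrained SMiE suppliers.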
  • 40
    Publication Date: 2019
    Description: This work examines apportionment of multiplicative risks by considering three dominance orderings: first-degree stochastic dominance, Rothschild and Stiglitz’s increase in risk and downside risk increase. We use the relative nth-degree risk aversion measure and decreasing relative nth-degree risk aversion to provide conditions guaranteeing the preference for “harm disaggregation” of multiplicative risks. Further, we relate our conclusions to the preference toward bivariate lotteries, which interpret correlation-aversion, cross-prudence and cross-temperance.
    Electronic ISSN: 2227-9091
    Topics: Economics
    Published by MDPI
  • 41
    Publication Date: 2019
    Description: I document a sizeable bias that might arise when valuing out-of-the-money American options via the Least Squares Method proposed by Longstaff and Schwartz (2001). The key point of this algorithm is the regression-based estimate of the continuation value of an American option. If this regression is ill-posed, the procedure might deliver biased results. The price of the American option might even fall below the price of its European counterpart. For call options, this is likely to occur when the dividend yield of the underlying is high. This distortion is documented within the standard Black–Scholes–Merton model as well as within its most common extensions (the jump-diffusion, the stochastic volatility and the stochastic interest rate models). Finally, I propose two easy and effective workarounds that fix this distortion.
    Electronic ISSN: 2227-9091
    Topics: Economics
    Published by MDPI
  • 42
    Publication Date: 2019
    Description: The purpose of this paper is to evaluate and estimate market risk for the ten major industries in Vietnam. The focus of the empirical analysis is on the energy sector, which has been designated as one of the four key industries, together with services, food, and telecommunications, targeted for economic development by the Vietnam Government through to 2020. The oil and gas industry is a separate energy-related major industry, and it is evaluated separately from energy. The data set is from 2009 to 2017, which is decomposed into two distinct sub-periods after the Global Financial Crisis (GFC), namely the immediate post-GFC (2009–2011) period and the normal (2012–2017) period, in order to identify the behavior of market risk for Vietnam’s major industries. For the stock market in Vietnam, the website used in this paper provided complete and detailed data for each stock, as classified by industry. Two widely used approaches to measure and analyze risk are used in the empirical analysis, namely Value-at-Risk (VaR) and Conditional Value-at-Risk (CVaR). The empirical findings indicate that energy and pharmaceuticals are the least risky industries, whereas oil and gas and securities carry the greatest risk. In general, there is strong empirical evidence that the four key industries display relatively low risk. For public policy, the Vietnam Government’s proactive emphasis on the targeted industries, including energy, to achieve sustainable economic growth and national economic development, seems to be working effectively. This paper presents striking empirical evidence that Vietnam’s industries have substantially improved their economic performance over the full sample, moving from relatively higher levels of market risk in the immediate post-GFC period to a lower risk environment in a normal period several years after the end of the calamitous GFC.
    Electronic ISSN: 2227-9091
    Topics: Economics
    Published by MDPI
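The two risk measures used in the study, VaR and CVaR, can be computed by historical simulation in a few lines; the returns below are simulated, not the Vietnamese industry data.

```python
import numpy as np

def var_cvar(returns, alpha=0.95):
    # Historical simulation: VaR is the alpha-quantile of losses,
    # CVaR the average loss at or beyond VaR
    losses = -np.asarray(returns, dtype=float)
    var = np.quantile(losses, alpha)
    cvar = losses[losses >= var].mean()
    return var, cvar

# Illustrative daily returns (placeholder for an industry index series)
rng = np.random.default_rng(0)
rets = rng.normal(0.0005, 0.02, 2000)
v95, c95 = var_cvar(rets, 0.95)
```

By construction CVaR is at least as large as VaR at the same level, which is why the paper reports both: CVaR captures the severity of losses in the tail beyond the VaR cutoff.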
  • 43
    Publication Date: 2019
    Description: Estimation of future mortality rates still plays a central role among life insurers in pricing their products and managing longevity risk. In the literature on mortality modeling, a wide number of stochastic models have been proposed, most of them forecasting future mortality rates by extrapolating one or more latent factors. The abundance of proposed models shows that forecasting future mortality from historical trends is non-trivial. Following the idea proposed in Deprez et al. (2017), we use machine learning algorithms, able to catch patterns that are not commonly identifiable, to calibrate a parameter (the machine learning estimator), improving the goodness of fit of standard stochastic mortality models. The machine learning estimator is then forecasted according to the Lee-Carter framework, allowing one to obtain a higher forecasting quality than the standard stochastic models. Out-of-sample forecasts are provided to verify the model accuracy.
    Electronic ISSN: 2227-9091
    Topics: Economics
    Published by MDPI
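The Lee-Carter framework referenced above decomposes log-mortality as log m(x,t) = a_x + b_x k_t and is classically fitted with a rank-1 SVD. A minimal sketch on a synthetic mortality surface (illustrative, not the paper's calibration):

```python
import numpy as np

def fit_lee_carter(log_m):
    # log m(x,t) = a_x + b_x * k_t, fitted by a rank-1 SVD of the centred surface
    a = log_m.mean(axis=1)                      # age profile a_x
    U, S, Vt = np.linalg.svd(log_m - a[:, None], full_matrices=False)
    b, k = U[:, 0], S[0] * Vt[0]
    b_sum = b.sum()                             # usual normalisation: sum(b_x) = 1
    return a, b / b_sum, k * b_sum              # (k_t sums to ~0 by centring)

# Synthetic rank-1 log-mortality surface: 10 ages x 30 years
ages, years = np.arange(10), np.arange(30)
a_true = -8.0 + 0.08 * ages
b_true = np.full(10, 0.1)
k_true = -0.5 * (years - years.mean())
log_m = a_true[:, None] + np.outer(b_true, k_true)
a, b, k = fit_lee_carter(log_m)
```

On exact rank-1 input the SVD recovers the surface; on real data the first singular pair gives the least-squares fit, and the period index k_t is then projected forward (typically as a random walk with drift).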
  • 44
    Publication Date: 2019
    Description: We introduce a financial stress index that was developed by the Office of Financial Research (OFR FSI) and detail its purpose, construction, interpretation, and use in financial market monitoring. The index employs a novel and flexible methodology using daily data from global financial markets. Analysis for the 2000–2018 time period is presented. Using a logistic regression framework and dates of government intervention in the financial system as a proxy for stress events, we found that the OFR FSI performs well in identifying systemic financial stress. In addition, we find that the OFR FSI leads the Chicago Fed National Activity Index in a Granger causality analysis, suggesting that increases in financial stress help predict decreases in economic activity.
    Electronic ISSN: 2227-9091
    Topics: Economics
    Published by MDPI
  • 45
    Publication Date: 2019
    Description: This study examines the effect of market risk on the financial performance of 31 non-financial companies listed on the Casablanca Stock Exchange (CSE) over the period 2000–2016. We utilized three alternative variables to assess financial performance, namely, the return on assets, the return on equity and the profit margin. We used the degree of financial leverage, the book-to-market ratio, and the gearing ratio as the indicators of market risk. Then, we employed the pooled OLS model, the fixed effects model, the random effects model, the difference-GMM and the system-GMM models. The results show that the different measures of market risk have significant negative influences on the companies’ financial performance. The elasticities are greater following the degree of financial leverage compared with the book-to-market ratio and the gearing ratio. In most cases, the firm’s age, the cash holdings ratio, the firm’s size, the debt-to-assets ratio, and the tangibility ratio have positive effects on financial performance, whereas the debt-to-income ratio and the stock turnover hurt the performance of these non-financial companies. Therefore, decision-makers and managers should mitigate market risk through appropriate strategies of risk management, such as derivatives and insurance techniques.
    Electronic ISSN: 2227-9091
    Topics: Economics
    Published by MDPI
  • 46
    Publication Date: 2019
    Description: The management of national social security systems is being challenged more and more by the rapid ageing of the population, especially in industrialized countries. In pursuit of pension system sustainability, several countries in Europe are setting up pension reforms linking the retirement age and/or benefits to life expectancy. In this context, the accurate modelling and projection of mortality rates and life expectancy play a central role and represent issues of great interest in recent literature. Our study refers to the Italian mortality experience and considers an indexing mechanism based on the expected residual life to adjust the retirement age and keep costs at an expected budgeted level, in the spirit of sharing the longevity risk between social security systems and retirees. In order to combine the fitting and projection performances of selected stochastic mortality models, a model-assembling technique is applied to face uncertainty in model selection, while also accounting for estimation uncertainty. The resulting proposal is an averaged model that is suitable for discussing the gender gap in longevity risk and its alleged narrowing over time.
    Electronic ISSN: 2227-9091
    Topics: Economics
    Published by MDPI
  • 47
    Publication Date: 2019
    Description: Extrapolative methods are one of the most commonly-adopted forecasting approaches in the literature on projecting future mortality rates. It can be argued that there are two types of mortality models using this approach. The first extracts patterns in age, time and cohort dimensions either in a deterministic fashion or a stochastic fashion. The second uses non-parametric smoothing techniques to model mortality and thus has no explicit constraints placed on the model. We argue that from a forecasting point of view, the main difference between the two types of models is whether they treat recent and historical information equally in the projection process. In this paper, we compare the forecasting performance of the two types of models using Great Britain male mortality data from 1950–2016. We also conduct a robustness test to see how sensitive the forecasts are to changes in the length of historical data used to calibrate the models. The main conclusion from the study is that more recent information should be given more weight in the forecasting process, as it has greater predictive power than historical information.
    Electronic ISSN: 2227-9091
    Topics: Economics
    Published by MDPI
  • 48
    Publication Date: 2019
    Description: Phase-type (PH) distributions are defined as distributions of lifetimes of finite continuous-time Markov processes. Their traditional applications are in queueing, insurance risk and reliability, but more recently also in finance and, though to a lesser extent, in life and health insurance. The advantage is that PH distributions form a dense class and that problems having explicit solutions for exponential distributions typically become computationally tractable under PH assumptions. In the first part of this paper, fitting of PH distributions to human lifetimes is considered. The class of generalized Coxian distributions is given special attention. Along the way, some new software is developed. In the second part, pricing of life insurance products such as guaranteed minimum death benefit and high-water benefit is treated for the case where the lifetime distribution is approximated by a PH distribution and the underlying asset price process is described by a jump diffusion with PH jumps. The expressions are typically explicit in terms of matrix-exponentials involving two matrices closely related to the Wiener-Hopf factorization, for which a Lévy process version has recently been developed for a PH horizon. The computational power of the approach is illustrated via a number of numerical examples.
    Electronic ISSN: 2227-9091
    Topics: Economics
    Published by MDPI
  • 49
    Publication Date: 2019
    Description: Standardized longevity risk transfers often involve modeling mortality rates of multiple populations. Some researchers have found that mortality indexes of selected countries are cointegrated, meaning that a linear relationship exists between the indexes. The vector error correction model (VECM) has been used to incorporate this relation, thereby forcing the mortality rates of multiple populations to revert to a long-run equilibrium. However, the long-run equilibrium may change over time. It is crucial to incorporate these changes so that mortality dependence is adequately modeled. In this paper, we develop a framework to examine the presence of equilibrium changes and to incorporate these changes into the mortality model. In particular, we focus on equilibrium changes caused by the threshold effect, the phenomenon that mortality indexes alternate between different VECMs depending on the value of a threshold variable. Our framework comprises two steps. In the first step, a statistical test is performed to examine the presence of a threshold effect in the VECM for multiple mortality indexes. In the second step, a threshold vector error correction model (TVECM) is fitted to the mortality indexes and model adequacy is evaluated. We illustrate this framework with the mortality data of the England and Wales (EW) and Canadian populations. We further apply the TVECM to forecast future mortalities and price an illustrative longevity bond with the multivariate Wang transform. Our numerical results show that the TVECM predicts much faster mortality improvement for EW and Canada than a single-regime VECM, and thus incorporating the threshold effect significantly increases the longevity bond price.
    Electronic ISSN: 2227-9091
    Topics: Economics
    Published by MDPI
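The threshold effect described above, where the error-correction speed depends on the lagged spread between two mortality indexes, can be illustrated with a toy two-series TVECM simulation. All parameter values below are illustrative, not estimates for the EW/Canada data.

```python
import numpy as np

def simulate_tvecm(n, a_low, a_high, threshold, rng):
    # Two cointegrated "mortality indexes": the error-correction speed a
    # switches when the lagged spread z = k1 - k2 exceeds the threshold
    k1, k2 = np.zeros(n), np.zeros(n)
    for t in range(1, n):
        z = k1[t - 1] - k2[t - 1]
        a = a_high if abs(z) > threshold else a_low
        k1[t] = k1[t - 1] - 0.3 - a * z + rng.normal(0.0, 0.5)  # common downward drift
        k2[t] = k2[t - 1] - 0.3 + a * z + rng.normal(0.0, 0.5)
    return k1, k2

rng = np.random.default_rng(6)
k1, k2 = simulate_tvecm(500, 0.1, 0.4, 1.0, rng)
```

Both indexes trend downward (mortality improvement) while the spread stays mean-reverting, with a stronger pull back to equilibrium in the outer regime, which is the qualitative behaviour the paper's two-step test-then-fit framework is designed to detect.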
  • 50
    Publication Date: 2019
    Description: We consider the optimal bail-out dividend problem with fixed transaction cost for a Lévy risk model with a constraint on the expected present value of injected capital. To solve this problem, we first consider the optimal bail-out dividend problem with transaction cost and capital injection and show the optimality of reflected (c1, c2)-policies. We then find the optimal Lagrange multiplier, by showing that in the dual Lagrangian problem the complementary slackness conditions are met. Finally, we present some numerical examples to support our results.
    Electronic ISSN: 2227-9091
    Topics: Economics
    Published by MDPI
  • 51
    Publication Date: 2019
    Description: Risk perception is an idiosyncratic process of interpretation. It is a highly personal process of making a decision based on an individual’s frame of reference that has evolved over time. The purpose of this paper is to find out the risk perception level of equity investors and to identify the factors influencing their risk perception. The study was conducted using a stratified random sampling design of 358 investors. It was found that the overall risk perception level of equity investors is moderate and that the main factors affecting their risk perception are information screening, investment education, fear psychosis, fundamental expertise, technical expertise, familiarity bias, information asymmetry, understanding of the market, etc. Considering the above findings, efforts should be made to bring people with a high risk perception to the low risk perception category by providing them with training to handle or manage high-risk scenarios which will help in promoting an equity-investment culture.
    Electronic ISSN: 2227-9091
    Topics: Economics
    Published by MDPI
  • 52
    Publication Date: 2019
    Description: In this paper, we suggest a Bayesian multivariate approach for pricing a reverse mortgage, allowing for house price risk, interest rate risk and longevity risk. We adopt the principle of maximum entropy in risk-neutralisation of these three risk components simultaneously. Our numerical results based on Australian data suggest that a reverse mortgage would be financially sustainable under the current financial environment and the model settings and assumptions.
    Electronic ISSN: 2227-9091
    Topics: Economics
    Published by MDPI
  • 53
    Publication Date: 2019
    Description: ‘Sustainable investment’ includes a variety of asset classes selected while caring for the causes of environmental, social, and governance (ESG). It is an investment strategy that seeks to combine social and/or environmental benefits with financial returns, thus linking the investor’s social, ethical, ecological and economic concerns. Under certain conditions, these indices also help to attract foreign capital seeking international participation in the local capital markets. The purpose of this paper is to study whether the sustainable investment alternatives offer better financial returns than the conventional indices from both developed and emerging markets. To maintain consistency, this paper comparatively analyzes the financial returns of the Thomson Reuters/S-Network global indices, namely the developed markets (excluding US) ESG index (TRESGDX), emerging markets ESG index (TRESGEX), US large-cap ESG index (TRESGUS) and Europe ESG index (TRESGEU), and those of the conventional markets, namely the MSCI World index (MSCI W), MSCI All Country World Equity index (MSCI ACWI), MSCI USA index (MSCI USA), MSCI Europe Australasia Far East index (MSCI EAFE), MSCI Emerging Markets index (MSCI EM) and MSCI Europe index (MSCI EU). The study also focuses on the inter-linkages between these indices. Daily closing prices of all the benchmark indices are taken for the five-year period of January 2013–December 2017. Line charts and unit-root tests are applied to check the stationarity of the series; Granger causality tests and auto-regressive conditional heteroskedasticity (ARCH)/GARCH-type modelling are performed to find the linkages between the markets under study, followed by Johansen’s cointegration test and a vector error correction model to test the volatility spillover between the sustainable and conventional indices. The study finds that the sustainable and conventional indices are integrated and that information flows between the two investment avenues. The results indicate no significant difference in performance between the sustainable indices and the traditional conventional indices, making the former a good substitute for the latter. Hence, financial/investment managers can obtain more insights regarding investment decisions, and the study further suggests that their portfolios should consider both sets of indices with a view to diversifying risk and hedging, and to reap the benefits of both. Additionally, corporate executives can use them to benchmark their own performance against peers and to track news.
    Electronic ISSN: 2227-9091
    Topics: Economics
    Published by MDPI
  • 54
    Publication Date: 2019
    Description: Risk management is a comparatively new field, and there is no core system of risk management in the construction industries of developing countries. In Pakistan, construction is an extremely risky industry that lacks a good reputation for handling risk. However, the industry is gradually giving risk management more importance as a result of increased competition and construction activity. For this purpose, a survey-based study was conducted to investigate the risk management practices used in construction projects in Pakistan. To achieve this objective, data were collected from 22 contractor firms working on 100 diverse projects. The analysis indicates that risk management has been implemented at a low level in the local environment. The results also disclose a high degree of correlation between effective risk management and project success. The findings reveal the importance of risk management techniques, their usage and implications, and the effect of these techniques on the success of construction projects from the contractor’s perspective, thus convincing the key participants of projects of the value of risk management.
    Electronic ISSN: 2227-9091
    Topics: Economics
    Published by MDPI
  • 55
    Publication Date: 2019
    Description: Determining the distributions of functions of random variables is one of the most important problems in statistics and applied mathematics, because distributions of functions have a wide range of applications in numerous areas in economics, finance, risk management, science, and others. However, most studies focus only on independent variables, or on some common distributions such as multivariate normal joint distributions, for the functions of dependent random variables. To bridge this gap in the literature, in this paper we first derive general formulas to determine both the density and the distribution of the product of two or more random variables via copulas, to capture the dependence structures among the variables. We then propose an approach combining a Monte Carlo algorithm, a graphical approach, and numerical analysis to efficiently estimate both the density and the distribution. We illustrate our approach by examining the shapes and behaviors of both the density and the distribution of the product of two log-normal random variables under several different copulas, including the Gaussian, Student-t, Clayton, Gumbel, Frank, and Joe copulas, and estimate some common measures, including Kendall’s coefficient, mean, median, standard deviation, skewness, and kurtosis, for the distributions. We find that different types of copulas affect the behavior of the distributions differently. In addition, we also discuss the behaviors under all the copulas above with the same Kendall’s coefficient. Our results are the foundation of any further study that relies on the density and cumulative probability functions of the product of two or more random variables. Thus, the theory developed in this paper is useful for academics, practitioners, and policy makers.
    Electronic ISSN: 2227-9091
    Topics: Economics
    Published by MDPI
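    The Monte Carlo side of the approach described above can be sketched in a few lines. The illustration below is not the authors' code: the function names, parameter defaults, and the choice of standard log-normal marginals are assumptions made here, and only the Gaussian copula (one of the six copulas studied) is shown.

    ```python
    import numpy as np

    def product_sample_gaussian_copula(rho, n=100_000, seed=42,
                                       mu=(0.0, 0.0), sigma=(1.0, 1.0)):
        """Sample the product X1*X2 of two log-normal marginals whose
        dependence comes from a Gaussian copula with correlation rho."""
        rng = np.random.default_rng(seed)
        # Correlated standard normals drive the Gaussian copula.
        cov = [[1.0, rho], [rho, 1.0]]
        z = rng.multivariate_normal([0.0, 0.0], cov, size=n)
        # Push each coordinate through its log-normal marginal.
        x1 = np.exp(mu[0] + sigma[0] * z[:, 0])
        x2 = np.exp(mu[1] + sigma[1] * z[:, 1])
        return x1 * x2

    def ecdf(samples, t):
        """Empirical distribution function of the sampled product at t."""
        return float(np.mean(samples <= t))
    ```

    With standard log-normal marginals, the log of the product is normal with mean zero under this copula, so the product's median is 1 and the empirical CDF at 1 should sit near 0.5; swapping in a Clayton or Gumbel copula changes the joint tail behaviour while leaving the marginals intact.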
  • 56
    Publication Date: 2019
    Description: As is well-known, the benefit of restricting Lévy processes without positive jumps is the “W, Z scale functions paradigm”, by which the knowledge of the scale functions W, Z extends immediately to other risk control problems. The same is true largely for strong Markov processes X_t, with the notable distinctions that (a) it is more convenient to use as “basis” differential exit functions ν, δ, and that (b) it is not yet known how to compute ν, δ or W, Z beyond the Lévy, diffusion, and a few other cases. The unifying framework outlined in this paper suggests, however, via an example that the spectrally negative Markov and Lévy cases are very similar (except for the level of work involved in computing the basic functions ν, δ). We illustrate the potential of the unified framework by introducing a new objective (33) for the optimization of dividends, inspired by the de Finetti problem of maximizing expected discounted cumulative dividends until ruin, where we replace ruin with an optimally chosen Azema-Yor/generalized draw-down/regret/trailing stopping time. This is defined as a hitting time of the “draw-down” process Y_t = sup_{0 ≤ s ≤ t} X_s − X_t obtained by reflecting X_t at its maximum. This new variational problem has been solved in a parallel paper.
    Electronic ISSN: 2227-9091
    Topics: Economics
    Published by MDPI
  • 57
    Publication Date: 2019
    Description: This paper proposes a data-driven approach, by means of an Artificial Neural Network (ANN), to value financial options and to calculate implied volatilities with the aim of accelerating the corresponding numerical methods. With ANNs being universal function approximators, this method trains an optimized ANN on a data set generated by a sophisticated financial model, and runs the trained ANN as an agent of the original solver in a fast and efficient way. We test this approach on three different types of solvers, including the analytic solution for the Black-Scholes equation, the COS method for the Heston stochastic volatility model and Brent’s iterative root-finding method for the calculation of implied volatilities. The numerical results show that the ANN solver can reduce the computing time significantly.
    Electronic ISSN: 2227-9091
    Topics: Economics
    Published by MDPI
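    The third solver above inverts the Black-Scholes formula for the volatility. The sketch below illustrates the idea with invented helper names, and substitutes plain bisection for Brent's method (Brent's method converges faster but is longer to write out); it is a sketch, not the paper's implementation.

    ```python
    from math import log, sqrt, exp, erf

    def norm_cdf(x):
        """Standard normal CDF via the error function."""
        return 0.5 * (1.0 + erf(x / sqrt(2.0)))

    def bs_call(S, K, T, r, sigma):
        """Black-Scholes price of a European call option."""
        d1 = (log(S / K) + (r + 0.5 * sigma**2) * T) / (sigma * sqrt(T))
        d2 = d1 - sigma * sqrt(T)
        return S * norm_cdf(d1) - K * exp(-r * T) * norm_cdf(d2)

    def implied_vol(price, S, K, T, r, lo=1e-6, hi=5.0, tol=1e-10):
        """Invert bs_call for sigma by bisection; the call price is
        strictly increasing in sigma, so the root is unique."""
        for _ in range(200):
            mid = 0.5 * (lo + hi)
            if bs_call(S, K, T, r, mid) < price:
                lo = mid
            else:
                hi = mid
            if hi - lo < tol:
                break
        return 0.5 * (lo + hi)
    ```

    Round-tripping a known volatility through `bs_call` and back through `implied_vol` recovers it to roughly the bisection tolerance.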
  • 58
    Publication Date: 2019
    Description: The aim of this paper is to construct prospective life tables adapted to the experience of Algerian retirees. Mortality data of the retired population are only available for the age interval [45, 95[ and for the period from 2004 to 2013. The use of the conventional prospective mortality models is not supposed to provide robust forecasts given data limitation in terms of either exposure to death risk or data length. To improve forecasting robustness, we use the global population mortality as an external reference. The adjustment of the experience mortality on the reference allows projecting the age-specific death rates calculated based on the experience of the retired population. We propose a generalized version of the Brass-type relation model incorporating a quadratic effect to perform the adjustment. Results show no significant difference for men, either retired or not, but reveal a gap of over three years in the remaining life expectancy at age 50 in favor of retired women compared to those of the global population.
    Electronic ISSN: 2227-9091
    Topics: Economics
    Published by MDPI
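    The quadratic Brass-type adjustment can be sketched as an ordinary least-squares fit on the logit scale. This is an illustration only: the function names are invented here, and the paper's exact specification and estimation method may differ.

    ```python
    import numpy as np

    def logit(q):
        """Logit transform of mortality rates q in (0, 1)."""
        return np.log(q / (1.0 - q))

    def fit_quadratic_brass(q_exp, q_ref):
        """Fit logit(q_exp) = a + b*logit(q_ref) + c*logit(q_ref)**2
        by ordinary least squares over the common ages."""
        x, y = logit(q_ref), logit(q_exp)
        # np.polyfit returns coefficients highest degree first: [c, b, a].
        c, b, a = np.polyfit(x, y, deg=2)
        return a, b, c

    def apply_brass(q_ref, a, b, c):
        """Map (projected) reference rates to experience rates."""
        y = a + b * logit(q_ref) + c * logit(q_ref) ** 2
        return 1.0 / (1.0 + np.exp(-y))
    ```

    Once fitted on the overlapping ages, `apply_brass` maps the projected reference rates into projected experience rates.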
  • 59
    Publication Date: 2019
    Description: A statistical inference for ruin probability from a certain discrete sample of the surplus is discussed under a spectrally negative Lévy insurance risk. We consider the Laguerre series expansion of ruin probability, and provide an estimator for any of its partial sums by computing the coefficients of the expansion. We show that the proposed estimator is asymptotically normal and consistent with the optimal rate of convergence and estimable asymptotic variance. This estimator enables not only a point estimation of ruin probability but also an approximated interval estimation and testing hypothesis.
    Electronic ISSN: 2227-9091
    Topics: Economics
    Published by MDPI
  • 60
    Publication Date: 2019
    Description: Contingent Convertible (CoCo) is a hybrid debt issued by banks with a specific feature forcing its conversion to equity in the event of the bank’s financial distress. CoCo carries two major risks: the risk of default, which threatens any type of debt instrument, plus the exclusive risk of mandatory conversion. In this paper, we propose a model to value CoCo debt instruments as a function of the debt ratio. Although the CoCo is a more expensive instrument than traditional debt, its presence in the capital structure lowers the cost of ordinary debt and reduces the total cost of debt. For initial equity holders, the presence of CoCo in the bank’s capital structure increases the shareholder’s aggregate value.
    Electronic ISSN: 2227-9091
    Topics: Economics
    Published by MDPI
  • 61
    Publication Date: 2019
    Description: Territory design and analysis using geographical loss cost are a key aspect of auto insurance rate regulation. The major objective of this work is to study the design of geographical rating territories by maximizing the within-group homogeneity, as well as maximizing the among-group heterogeneity from a statistical perspective, while maximizing the actuarial equity of pure premium, as required by insurance regulation. To achieve this goal, the spatially-constrained clustering of industry-level loss cost was investigated. Within this study, in order to meet contiguity, which is a legal requirement on the design of geographical rating territories, a clustering approach based on Delaunay triangulation is proposed. Furthermore, an entropy-based approach was introduced to quantify the homogeneity of clusters, while both the elbow method and the gap statistic are used to determine the initial number of clusters. This study illustrated the usefulness of the spatially-constrained clustering approach in defining geographical rating territories for insurance rate regulation purposes. The significance of this work is to provide a new solution for better designing geographical rating territories. The proposed method can be useful for other demographic data analyses because of the similar nature of the spatial constraint.
    Electronic ISSN: 2227-9091
    Topics: Economics
    Published by MDPI
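    The entropy-based homogeneity idea can be illustrated with a short sketch (invented function names; the paper's exact entropy definition and binning choices may differ): discretize the loss costs within each cluster and compute the Shannon entropy, with lower values indicating more homogeneous territories.

    ```python
    import numpy as np

    def cluster_entropy(values, n_bins=10):
        """Shannon entropy of binned values within one cluster;
        lower entropy indicates a more homogeneous cluster."""
        counts, _ = np.histogram(values, bins=n_bins)
        p = counts[counts > 0] / counts.sum()
        return float(-(p * np.log(p)).sum())

    def mean_within_cluster_entropy(values, labels, n_bins=10):
        """Average within-cluster entropy over all cluster labels."""
        values = np.asarray(values, dtype=float)
        labels = np.asarray(labels)
        ents = [cluster_entropy(values[labels == k], n_bins)
                for k in np.unique(labels)]
        return float(np.mean(ents))
    ```

    A cluster of identical loss costs has entropy zero, while a cluster whose loss costs spread evenly across all bins attains the maximum log(n_bins); comparing the average across candidate partitions gives a simple homogeneity score.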
  • 62
    Publication Date: 2019
    Description: The presented research discusses general approaches to analyze and model healthcare data at the treatment level and at the store level. The paper consists of two parts: (1) a general analysis method for store-level product sales of an organization and (2) a treatment-level analysis method of healthcare expenditures. In the first part, our goal is to develop a modeling framework to help understand the factors influencing the sales volume of stores maintained by a healthcare organization. In the second part of the paper, we demonstrate a treatment-level approach to modeling healthcare expenditures. In this part, we aim to improve the operational-level management of a healthcare provider by predicting the total cost of medical services. From this perspective, treatment-level analyses of medical expenditures may help provide a micro-level approach to predicting the total amount of expenditures for a healthcare provider. We present a model for analyzing a specific type of medical data, which may arise commonly in a healthcare provider’s standardized database. We do this by using an extension of the frequency-severity approach to modeling insurance expenditures from the actuarial science literature.
    Electronic ISSN: 2227-9091
    Topics: Economics
    Published by MDPI
  • 63
    Publication Date: 2019
    Description: In this paper, we propose a credible regression approach with random coefficients to model and forecast the mortality dynamics of a given population with limited data. Age-specific mortality rates are modelled and extrapolation methods are utilized to estimate future mortality rates. The results on Greek mortality data indicate that credibility regression contributed to more accurate forecasts than those produced from the Lee–Carter and Cairns–Blake–Dowd models. An application on pricing insurance-related products is also provided.
    Electronic ISSN: 2227-9091
    Topics: Economics
    Published by MDPI
  • 64
    Publication Date: 2019
    Description: We study a portfolio selection problem in a continuous-time Itô–Markov additive market with prices of financial assets described by Markov additive processes that combine Lévy processes and regime switching models. Thus, the model takes into account two sources of risk: the jump diffusion risk and the regime switching risk. For this reason, the market is incomplete. We complete the market by enlarging it with the use of a set of Markovian jump securities, Markovian power-jump securities and impulse regime switching securities. Moreover, we give conditions under which the market is asymptotic-arbitrage-free. We solve the portfolio selection problem in the Itô–Markov additive market for the power utility and the logarithmic utility.
    Electronic ISSN: 2227-9091
    Topics: Economics
    Published by MDPI
  • 65
    Publication Date: 2019
    Description: There is an increasing influence of machine learning in business applications, with many solutions already implemented and many more being explored. Since the global financial crisis, risk management in banks has gained more prominence, and there has been a constant focus on how risks are being detected, measured, reported and managed. Considerable research in academia and industry has focused on the developments in banking and risk management and the current and emerging challenges. This paper, through a review of the available literature, seeks to analyse and evaluate machine-learning techniques that have been researched in the context of banking risk management, and to identify areas or problems in risk management that have been inadequately explored and are potential areas for further research. The review shows that the application of machine learning in the management of banking risks such as credit risk, market risk, operational risk and liquidity risk has been explored; however, it does not appear commensurate with the current industry level of focus on both risk management and machine learning. A large number of areas remain in bank risk management that could significantly benefit from the study of how machine learning can be applied to address specific problems.
    Electronic ISSN: 2227-9091
    Topics: Economics
    Published by MDPI
  • 66
    Publication Date: 2019
    Description: In this paper, we develop a contingent claim model to evaluate the equity, default risk, and efficiency gain/loss from managerial overconfidence of a shadow-banking life insurer under the purchases of distressed assets by the government. Our paper focuses on managerial overconfidence where the chief executive officer (CEO) overestimates the returns on investment. The investment market faced by the life insurer is imperfectly competitive, and investment is core to the provision of profit-sharing life insurance policies. We show that CEO overconfidence raises the default risk in the life insurer’s equity returns, thereby adversely affecting financial stability. Either shadow-banking involvement or a government bailout attenuates this unfavorable effect. There is an efficiency gain from CEO overconfidence to investment. A government bailout helps to reduce the life insurer’s default risk, but simultaneously reduces the efficiency gain from CEO overconfidence. Our results contribute to the managerial overconfidence literature linking insurer shadow-banking involvement and government bailouts, in particular during a financial crisis.
    Electronic ISSN: 2227-9091
    Topics: Economics
    Published by MDPI
  • 67
    Publication Date: 2019
    Description: One of the main challenges investors have to face is model uncertainty. Typically, the dynamic of the assets is modeled using two parameters: the drift vector and the covariance matrix, which are both uncertain. Since the variance/covariance parameter is assumed to be estimated with a certain level of confidence, we focus on drift uncertainty in this paper. Building on filtering techniques and learning methods, we use a Bayesian learning approach to solve the Markowitz problem and provide a simple and practical procedure to implement the optimal strategy. To illustrate the value added of using the optimal Bayesian learning strategy, we compare it with an optimal nonlearning strategy that keeps the drift constant at all times. In order to emphasize the advantage of the Bayesian learning strategy over the nonlearning one in different situations, we experiment with three different investment universes: indices of various asset classes, currencies, and smart beta strategies.
    Electronic ISSN: 2227-9091
    Topics: Economics
    Published by MDPI
  • 68
    Publication Date: 2019
    Description: Life Insurance Retirement Plans (LIRPs) offer tax-deferred cash value accumulation, tax-free withdrawals (if properly structured), and a tax-free death benefit to beneficiaries. Thus, LIRPs share many of the tax advantages of other retirement savings vehicles but with less restrictive limitations on income and contributions. Opinions are mixed about the effectiveness of LIRPs; some financial advisers recommend them enthusiastically, while others are more skeptical. In this paper, we examine the potential of LIRPs to meet both income and bequest needs in retirement. We contrast retirement portfolios that include a LIRP with those that include only investment products with no life insurance. We consider different issue ages, face amounts, and withdrawal patterns. We simulate market scenarios and we demonstrate that portfolios that include LIRPs yield higher legacy potential and smaller income risk than those that exclude it. Thus, we conclude that the inclusion of a LIRP can improve financial outcomes in retirement.
    Electronic ISSN: 2227-9091
    Topics: Economics
    Published by MDPI
  • 69
    Publication Date: 2019
    Description: In this paper, we employ 99% intraday value-at-risk (VaR) and intraday expected shortfall (ES) as risk metrics to assess the competency of the Multiplicative Component Generalised Autoregressive Heteroskedasticity (MC-GARCH) models based on the 1-min EUR/USD exchange rate returns. Five distributional assumptions for the innovation process are used to analyse their effects on the modelling and forecasting performance. The high-frequency volatility models were validated in terms of in-sample fit based on various statistical and graphical tests. A more rigorous validation procedure involves testing the predictive power of the models. Therefore, three backtesting procedures were used for the VaR, namely Kupiec’s test, a duration-based backtest, and an asymmetric VaR loss function. Similarly, three backtests were employed for the ES: a regression-based backtesting procedure, the Exceedance Residual backtest and the V-Tests. The validation results show that non-normal distributions are best suited for both model fitting and forecasting. The MC-GARCH(1,1) model under the Generalised Error Distribution (GED) innovation assumption gave the best fit to the intraday data and gave the best results for the ES forecasts. However, the asymmetric Skewed Student’s-t distribution for the innovation process provided the best results for the VaR forecasts. This paper presents the results of the first empirical study (to the best of the authors’ knowledge) in: (1) forecasting the intraday Expected Shortfall (ES) under different distributional assumptions for the MC-GARCH model; (2) assessing the MC-GARCH model under the Generalised Error Distribution (GED) innovation; (3) evaluating and ranking the VaR predictability of the MC-GARCH models using an asymmetric loss function.
    Electronic ISSN: 2227-9091
    Topics: Economics
    Published by MDPI
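    Of the three VaR backtests listed, Kupiec's proportion-of-failures test is the simplest to reproduce. A sketch (invented function name; assumes independent violation indicators) computes the likelihood-ratio statistic, which is compared with the χ²(1) critical value of about 3.84 at the 5% level:

    ```python
    from math import log

    def kupiec_pof(num_obs, num_violations, var_level=0.99):
        """Kupiec proportion-of-failures likelihood-ratio statistic for
        unconditional VaR coverage; asymptotically chi-square, 1 df."""
        p = 1.0 - var_level          # expected violation rate
        t, x = num_obs, num_violations
        if x == 0:
            # Limit of the LR statistic as the observed rate -> 0.
            return -2.0 * t * log(1.0 - p)
        phat = x / t                 # observed violation rate
        log_l0 = (t - x) * log(1.0 - p) + x * log(p)
        log_l1 = (t - x) * log(1.0 - phat) + x * log(phat)
        return -2.0 * (log_l0 - log_l1)
    ```

    With 1000 observations at the 99% level, 10 violations match the expected rate exactly (statistic near zero), while 25 violations push the statistic well past 3.84 and reject correct coverage.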
  • 70
    Publication Date: 2019
    Description: We use Object Oriented Bayesian Networks (OOBNs) to analyze complex ties in the equity market and to detect drivers for the Standard & Poor’s 500 (S&P 500) index. To such aim, we consider a vast number of indicators drawn from various investment areas (Value, Growth, Sentiment, Momentum, and Technical Analysis), and, with the aid of OOBNs, we study the role they played along time in influencing the dynamics of the S&P 500. Our results highlight that the centrality of the indicators varies in time, and offer a starting point for further inquiries devoted to combine OOBNs with trading platforms.
    Electronic ISSN: 2227-9091
    Topics: Economics
    Published by MDPI
  • 71
    Publication Date: 2019
    Description: Based on a rich dataset of recoveries donated by a debt collection business, recovery rates for non-performing loans taken from a single European country are modelled using linear regression, linear regression with Lasso, beta regression and inflated beta regression. We also propose a two-stage model: a beta mixture model combined with a logistic regression model. The proposed model allowed us to model the multimodal distribution we found for these recovery rates. All models were built using loan characteristics, default data and collections data prior to purchase by the debt collection business. The intended use of the models was to estimate future recovery rates for improved risk assessment, capital requirement calculations and bad debt management. They were compared using a range of quantitative performance measures under K-fold cross validation. Among all the models, we found that the proposed two-stage beta mixture model performs best.
    Electronic ISSN: 2227-9091
    Topics: Economics
    Published by MDPI
  • 72
    Publication Date: 2019
    Description: Solvency II requirements introduced new issues for actuarial risk management in non-life insurance, challenging the market to be conscious of its own risk profile and to investigate the sensitivity of the solvency ratio to the insurance risks and technical results from both a short-term and a medium-term perspective. To this aim, in the present paper, a partial internal model for premium risk is developed for three multi-line non-life insurers, and the impact of some different business mixes is analyzed. Furthermore, the risk-mitigation and profitability impact of reinsurance in the premium risk model is introduced, and a global framework for a feasible application of this model consistent with a medium-term analysis is provided. Numerical results are also presented, with evidence of various effects for several portfolios and reinsurance arrangements, pointing out the main reasons for these differences.
    Electronic ISSN: 2227-9091
    Topics: Economics
    Published by MDPI
  • 73
    Publication Date: 2019
    Description: I derive practical formulas for optimal arrangements between sophisticated stock market investors (continuous-time Kelly gamblers or, more generally, CRRA investors) and the brokers who lend them cash for leveraged bets on a high Sharpe asset (i.e., the market portfolio). Rather than, say, the broker posting a monopoly price for margin loans, the gambler agrees to use a greater quantity of margin debt than he otherwise would in exchange for an interest rate that is lower than the broker would otherwise post. The gambler thereby attains a higher asymptotic capital growth rate and the broker enjoys a greater rate of intermediation profit than would be obtained under non-cooperation. If the threat point represents a complete breakdown of negotiations (resulting in zero margin loans), then we get an elegant rule of thumb: r_L* = (3/4) r + (1/4)(ν − σ²/2), where r is the broker’s cost of funds, ν is the compound-annual growth rate of the market index, and σ is the annual volatility. We show that, regardless of the particular threat point, the gambler will negotiate to size his bets as if he himself could borrow at the broker’s call rate.
    Electronic ISSN: 2227-9091
    Topics: Economics
    Published by MDPI
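    The closing rule of thumb, r_L* = (3/4) r + (1/4)(ν − σ²/2), is directly computable. A one-line sketch (the example inputs in the note below are illustrative, not values from the paper):

    ```python
    def negotiated_margin_rate(r, nu, sigma):
        """Rule-of-thumb negotiated margin loan rate when the threat
        point is a complete breakdown of negotiations:
            r_L* = (3/4) * r + (1/4) * (nu - sigma**2 / 2)
        r: broker's cost of funds, nu: compound-annual growth rate of
        the market index, sigma: annual volatility."""
        return 0.75 * r + 0.25 * (nu - sigma**2 / 2.0)
    ```

    For example, with a 2% cost of funds, 9% compound index growth and 18% volatility, the negotiated rate comes out to about 3.3%, strictly between the broker's cost of funds and the gambler's growth-adjusted alternative ν − σ²/2.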
  • 74
    Publication Date: 2019
    Description: Credit default swap (CDS) spreads measure the default risk of the reference entity and have been frequently used in recent empirical papers. To provide a rigorous econometrics foundation for empirical CDS analysis, this paper applies the augmented Dickey–Fuller, Phillips–Perron, Kwiatkowski–Phillips–Schmidt–Shin, and Ng–Perron tests to study the unit root property of CDS spreads, and it uses the Phillips–Ouliaris–Hansen tests to determine whether they are cointegrated. The empirical sample consists of daily CDS spreads of the six large U.S. banks from 2001 to 2018. The main findings are that it is log, not raw, CDS spreads that are unit root processes, and that log CDS spreads are cointegrated. These findings imply that, even though the risks of individual banks may deviate from each other in the short run, there is a long-run relation that ties them together. As these CDS spreads are an important input for financial systemic risk, there are at least two policy implications. First, in monitoring systemic risk, policymakers should focus on long-run trends rather than short-run fluctuations of CDS spreads. Second, in controlling systemic risk, policy measures that reduce the long-run risks of individual banks, such as stress testing and capital buffers, are helpful in mitigating overall systemic risk.
    Electronic ISSN: 2227-9091
    Topics: Economics
    Published by MDPI
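    The unit-root tests applied in the paper come from a common family; the simplest member is the no-lag Dickey-Fuller regression sketched below (the augmented and Phillips-Perron variants the paper uses add lag terms or robust standard errors). The function name and implementation details are assumptions made here.

    ```python
    import numpy as np

    def dickey_fuller_tstat(y):
        """t-statistic on beta in the regression
            diff(y)_t = alpha + beta * y_{t-1} + e_t.
        Compare with Dickey-Fuller critical values (about -2.86 at the
        5% level with a constant); large negative values reject a unit
        root, values near zero fail to reject."""
        y = np.asarray(y, dtype=float)
        dy = np.diff(y)
        x = np.column_stack([np.ones(len(dy)), y[:-1]])
        coef, *_ = np.linalg.lstsq(x, dy, rcond=None)
        resid = dy - x @ coef
        s2 = resid @ resid / (len(dy) - 2)     # residual variance
        cov = s2 * np.linalg.inv(x.T @ x)      # OLS coefficient covariance
        return coef[1] / np.sqrt(cov[1, 1])
    ```

    On a simulated random walk the statistic stays close to zero (a unit root is not rejected), while on a stationary AR(1) series it is strongly negative.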
  • 75
    Publication Date: 2019
    Description: This paper provides a critical analysis of the subadditivity axiom, which is the key condition for coherent risk measures. Contrary to the subadditivity assumption, bank mergers can create extra risk. We begin with an analysis of how a merger affects depositors, junior or senior bank creditors, and bank owners. Next, it is shown that bank mergers can result in higher payouts having to be made by the deposit insurance scheme. Finally, we demonstrate that if banks are interconnected via interbank loans, a bank merger could lead to additional contagion risks. We conclude that the subadditivity assumption should be rejected, since a subadditive risk measure, by definition, cannot account for such increased risks.
    Electronic ISSN: 2227-9091
    Topics: Economics
    Published by MDPI
  • 76
    Publication Date: 2019
    Description: Long-term care (LTC) encompasses a set of services provided to impaired and dependent elderly people. To assess the level of dependence, several scales are used, including activities of daily living (ADL), instrumental ADL (IADL) and functional limitations. Once an elderly person fails to perform these activities independently, he or she requires special assistance. Help can be provided as informal care by relatives and as formal care by professionals. The aim of this research is to study individual characteristics that relate to the demand for LTC and to analyze the relation between formal and informal care. We base our study on data from the Swiss Health Survey, focusing on respondents aged over 65 years. Using the structural equation modeling technique, we develop a statistical model that treats the dependence concept as a latent variable. This hidden dependence variable combines three indices linked to the limitations in ADL, in IADL and functional limitations. Accounting for causality links between covariates enables us to include the indirect effect of pathologies on the receipt of LTC mediated via dependence. In our model, we do not assume a causal relationship between formal and informal care. From our results, we observe a significant impact of pathologies as well as of the socio-demographic factors on the demand for LTC. The relationship between formal and informal care is found to be of both a complementary and substitutional nature.
    Electronic ISSN: 2227-9091
    Topics: Economics
    Published by MDPI
  • 77
    Publication Date: 2019
    Description: This paper examines bank liquidity risk using a maturity mismatch indicator of loans and deposits (LTDm) during a specific period. Core banking activities that are based on the process of maturity transformation are the most exposed to liquidity risk. The financial crisis in 2007–2009 highlighted the importance of liquidity to the functioning of both the financial markets and the banking sector. We investigate how characteristics of a bank, such as size, capital, and business model, are related to liquidity risk, using a sample of European banks in the period after the financial crisis, from 2011 to 2017. Employing a generalized method of moments two-step estimator, we find that bank size increases liquidity risk, whereas capital is not an effective deterrent. Moreover, our findings reveal that, for savings banks, income diversification raises liquidity risk, while investment banks reliant on non-deposit funding decrease their exposure to liquidity risk.
    Electronic ISSN: 2227-9091
    Topics: Economics
    Published by MDPI
  • 78
    Publication Date: 2019
    Description: This paper considers fundamental questions of arbitrage pricing that arise when the uncertainty model incorporates ambiguity about risk. This additional ambiguity motivates a new principle of risk- and ambiguity-neutral valuation as an extension of the paper by Ross (1976) (Ross, Stephen A. 1976. The arbitrage theory of capital asset pricing. Journal of Economic Theory 13: 341–60). In the spirit of Harrison and Kreps (1979) (Harrison, J. Michael, and David M. Kreps. 1979. Martingales and arbitrage in multiperiod securities markets. Journal of Economic Theory 20: 381–408), the paper establishes a micro-economic foundation of viability in which ambiguity-neutrality imposes a fair-pricing principle via symmetric multiple prior martingales. The resulting equivalent symmetric martingale measure set exists if the uncertain volatility in asset prices is driven by an ambiguous Brownian motion.
    Electronic ISSN: 2227-9091
    Topics: Economics
    Published by MDPI
  • 79
    Publication Date: 2019
    Description: We propose a novel approach for loss reserving based on deep neural networks. The approach allows for joint modeling of paid losses and claims outstanding, and incorporation of heterogeneous inputs. We validate the models on loss reserving data across lines of business, and show that they improve on the predictive accuracy of existing stochastic methods. The models require minimal feature engineering and expert input, and can be automated to produce forecasts more frequently than manual workflows.
    Electronic ISSN: 2227-9091
    Topics: Economics
    Published by MDPI
  • 80
    Publication Date: 2019
    Description: In this paper, a new heavy-tailed distribution, the mixture Pareto-loggamma distribution, derived through an exponential transformation of the generalized Lindley distribution, is introduced. The resulting model is expressed as a convex sum of the classical Pareto and a special case of the loggamma distribution. A comprehensive exploration of its statistical properties and theoretical results related to insurance are provided. Estimation is performed by using the method of log-moments and maximum likelihood. Also, as the modal value of this distribution is expressed in closed form, composite parametric models are easily obtained by a mode-matching procedure. The performance of both the mixture Pareto-loggamma distribution and the composite models is tested by employing different claims datasets.
    Electronic ISSN: 2227-9091
    Topics: Economics
    Published by MDPI
  • 81
    Publication Date: 2019
    Description: The beta coefficient plays a crucial role in finance as a risk measure of a portfolio in comparison to the benchmark portfolio. In this paper, we investigate the statistical properties of the sample estimator for the beta coefficient. Assuming that both the holding portfolio and the benchmark portfolio consist of the same assets, whose returns are multivariate normally distributed, we provide the finite-sample and asymptotic distributions of the sample estimator for the beta coefficient. These findings are used to derive a statistical test for the beta coefficient and to construct a confidence interval for it. Moreover, we show that the sample estimator is an unbiased estimator for the beta coefficient. The theoretical results are illustrated in an empirical study.
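As background to the estimator this abstract studies: the sample beta is the ratio of the sample covariance between portfolio and benchmark returns to the sample variance of the benchmark. A minimal sketch with simulated returns (all names and parameters are illustrative, not taken from the paper):

```python
import numpy as np

def sample_beta(portfolio_returns, benchmark_returns):
    """Sample beta: Cov(r_p, r_b) / Var(r_b), using unbiased sample moments."""
    cov = np.cov(portfolio_returns, benchmark_returns, ddof=1)
    return cov[0, 1] / cov[1, 1]

# Simulated example: the portfolio is built with a true beta of 1.5.
rng = np.random.default_rng(42)
r_b = rng.normal(0.0, 0.01, size=5000)               # benchmark returns
r_p = 1.5 * r_b + rng.normal(0.0, 0.002, size=5000)  # portfolio returns
beta_hat = sample_beta(r_p, r_b)                     # close to 1.5
```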
    Electronic ISSN: 2227-9091
    Topics: Economics
    Published by MDPI
  • 82
    Publication Date: 2019
    Description: Risk models developed on one dataset are often applied to new data and, in such cases, it is prudent to check that the model is suitable for the new data. An important application is in the banking industry, where statistical models are applied to loans to determine provisions and capital requirements. These models are developed on historical data, and regulations require their monitoring to ensure they remain valid on current portfolios—often years after the models were developed. The Population Stability Index (PSI) is an industry standard to measure whether the distribution of the current data has shifted significantly from the distribution of data used to develop the model. This paper explores several disadvantages of the PSI and proposes the Prediction Accuracy Index (PAI) as an alternative. The superior properties and interpretation of the PAI are discussed, and it is concluded that the PAI can more accurately summarise the level of population stability, helping risk analysts and managers determine whether the model remains fit for purpose.
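For reference, the Population Stability Index examined in this abstract has a standard closed form, PSI = Σ (a_i − e_i) ln(a_i / e_i), over bins i with development-sample proportions e_i and current-sample proportions a_i. A minimal sketch (binning choices and names are illustrative; this is not the paper's proposed PAI):

```python
import numpy as np

def psi(expected, actual, bins=10):
    """Population Stability Index between a development-time sample
    (expected) and a current sample (actual), with quantile bins
    taken from the development sample."""
    edges = np.quantile(expected, np.linspace(0.0, 1.0, bins + 1))
    edges[0], edges[-1] = -np.inf, np.inf   # catch out-of-range values
    e = np.histogram(expected, bins=edges)[0] / len(expected)
    a = np.histogram(actual, bins=edges)[0] / len(actual)
    e, a = np.clip(e, 1e-6, None), np.clip(a, 1e-6, None)  # avoid log(0)
    return float(np.sum((a - e) * np.log(a / e)))

rng = np.random.default_rng(0)
dev = rng.normal(size=10_000)
psi_same = psi(dev, dev)         # identical populations: 0
psi_shift = psi(dev, dev + 1.0)  # shifted population: clearly positive
```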
    Electronic ISSN: 2227-9091
    Topics: Economics
    Published by MDPI
  • 83
    Publication Date: 2019
    Description: An accurate assessment of the risk of extreme environmental events is of great importance for populations, authorities and the banking/insurance/reinsurance industry. Koch (2017) introduced a notion of spatial risk measure and a corresponding set of axioms which are well suited to analyze the risk due to events having a spatial extent, precisely such as environmental phenomena. The axiom of asymptotic spatial homogeneity is of particular interest since it allows one to quantify the rate of spatial diversification when the region under consideration becomes large. In this paper, we first investigate the general concepts of spatial risk measures and corresponding axioms further and thoroughly explain the usefulness of this theory for both actuarial science and practice. Second, in the case of a general cost field, we give sufficient conditions such that spatial risk measures associated with expectation, variance, value-at-risk as well as expected shortfall and induced by this cost field satisfy the axioms of asymptotic spatial homogeneity of order 0, −2, −1 and −1, respectively. Last but not least, in the case where the cost field is a function of a max-stable random field, we provide conditions on both the function and the max-stable field ensuring the latter properties. Max-stable random fields are relevant when assessing the risk of extreme events since they appear as a natural extension of multivariate extreme-value theory to the level of random fields. Overall, this paper improves our understanding of spatial risk measures as well as of their properties with respect to the space variable and generalizes many results obtained in Koch (2017).
    Electronic ISSN: 2227-9091
    Topics: Economics
    Published by MDPI
  • 84
    Publication Date: 2019
    Description: Banks make profits from the difference between short-term and long-term loan interest rates. To issue loans, banks raise funds from capital markets. Since the long-term loan rate is relatively stable but short-term interest rates are usually variable, there is an interest rate risk. Therefore, banks need information about the optimal leverage strategies based on the current economic situation. Recent studies of the economic crisis by many economists showed that the crisis was due to excessive leveraging by “big banks”. This leveraging turns out to be close to Kelly’s optimal point. It is known that Kelly’s strategy does not address risk adequately. We used the return–drawdown ratio and the inflection point of Kelly’s cumulative return curve over a finite investment horizon to derive more conservative leverage levels. Moreover, we carried out a sensitivity analysis to determine strategies during a period of rising interest rates, which is the most important and risky period in which to leverage. Thus, we brought theoretical results closer to practical applications. Furthermore, by using the sensitivity analysis method, banks can change the allocation sizes of loans with different maturities to mediate the risks corresponding to different monetary policy environments. This provides bank managers with flexible tools for mitigating risk.
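As background to the Kelly strategy the abstract refers to: for an asset with drift μ and volatility σ, the expected log-growth rate at leverage f is g(f) = fμ − f²σ²/2, maximized at the Kelly point f* = μ/σ². A minimal sketch (parameters illustrative, not from the paper):

```python
def kelly_growth(f, mu, sigma):
    """Expected log-growth rate at leverage fraction f for an asset
    with drift mu and volatility sigma (continuous-time Kelly setup)."""
    return f * mu - 0.5 * f ** 2 * sigma ** 2

def kelly_fraction(mu, sigma):
    """Leverage maximizing the log-growth rate: f* = mu / sigma**2."""
    return mu / sigma ** 2

mu, sigma = 0.04, 0.20              # illustrative drift and volatility
f_star = kelly_fraction(mu, sigma)  # = 1.0 for these parameters
half_kelly = 0.5 * f_star           # a common conservative choice
```

Leveraging beyond f* only adds risk: g(2f*) falls back to zero growth, which is one way to read the abstract's point that operating near Kelly's optimum leaves no safety margin.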
    Electronic ISSN: 2227-9091
    Topics: Economics
    Published by MDPI
  • 85
    Publication Date: 2019
    Description: Two insurance companies I1, I2 with reserves R1(t), R2(t) compete for customers, such that in a suitable differential game the smaller company I2 with R2(0) < R1(0) aims at minimizing R1(t) − R2(t) by using the premium p2 as control, and the larger I1 at maximizing it by using p1. Deductibles K1, K2 are fixed but may be different. If K1 > K2 and I2 is the leader choosing its premium first, conditions for a Stackelberg equilibrium are established. For gamma-distributed rates of claim arrivals, explicit equilibrium premiums are obtained and shown to depend on the running reserve difference. The analysis is based on the diffusion approximation to a standard Cramér-Lundberg risk process extended to allow investment in a risk-free asset.
    Electronic ISSN: 2227-9091
    Topics: Economics
    Published by MDPI
  • 86
    Publication Date: 2019
    Description: We study the optimal excess-of-loss reinsurance problem when both the intensity of the claims arrival process and the claim size distribution are influenced by an exogenous stochastic factor. We assume that the insurer’s surplus is governed by a marked point process with dual-predictable projection affected by an environmental factor and that the insurance company can borrow and invest money at a constant real-valued risk-free interest rate r. Our model allows for stochastic risk premia, which take into account risk fluctuations. Using stochastic control theory based on the Hamilton-Jacobi-Bellman equation, we analyze the optimal reinsurance strategy under the criterion of maximizing the expected exponential utility of the terminal wealth. A verification theorem for the value function in terms of classical solutions of a backward partial differential equation is provided. Finally, some numerical results are discussed.
    Electronic ISSN: 2227-9091
    Topics: Economics
    Published by MDPI
  • 87
    Publication Date: 2019
    Description: We explore the number of Monte Carlo steps required to bring the sampling error of the estimated 99.9% quantile within an acceptable threshold. Our research is of primary interest to practitioners working in the area of operational risk measurement, where the annual loss distribution cannot be analytically determined in advance. Usually, the frequency and the severity distributions must be adequately combined and elaborated with Monte Carlo methods in order to estimate the loss distributions and risk measures. Naturally, financial analysts and regulators are interested in mitigating sampling errors, as prescribed in EU Regulation 2018/959. In particular, the sampling error of the 99.9% quantile is of paramount importance, along the lines of EU Regulation 575/2013. The Monte Carlo error for the operational risk measure is here assessed on the basis of the binomial distribution. Our approach is then applied to realistic simulated data, yielding a comparable precision of the estimate with a much lower computational effort, when compared to bootstrap, Monte Carlo repetition, and two other methods based on numerical optimization.
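The binomial device mentioned here can be sketched as follows: among n simulated losses, the number falling below the true p-quantile is Binomial(n, p), so order statistics at the binomial tail quantiles bracket the true quantile. A rough illustration (not the paper's exact procedure; the loss distribution and sample size are made up):

```python
import numpy as np
from scipy.stats import binom

def quantile_ci(sample, p=0.999, conf=0.95):
    """Distribution-free CI for the p-quantile: the number of draws
    below the true quantile is Binomial(n, p), so order statistics at
    the binomial tail quantiles bracket it."""
    x = np.sort(sample)
    n = len(x)
    alpha = 1.0 - conf
    lo = int(binom.ppf(alpha / 2, n, p))              # lower order statistic
    hi = min(int(binom.ppf(1 - alpha / 2, n, p)), n - 1)
    return x[lo], x[hi]

rng = np.random.default_rng(1)
losses = rng.lognormal(mean=0.0, sigma=2.0, size=200_000)
low, high = quantile_ci(losses, p=0.999)  # brackets the 99.9% quantile
```

Widening of the interval (high − low) as n shrinks gives a direct handle on how many simulation steps are needed for a target precision.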
    Electronic ISSN: 2227-9091
    Topics: Economics
    Published by MDPI
  • 88
    Publication Date: 2019
    Description: Annuity providers are increasingly exposed to longevity risk due to the increase in life expectancy. To hedge this risk, new longevity derivatives have been proposed (longevity bonds, q-forwards, S-swaps…). Although academic researchers, policy makers and practitioners have discussed them for years, longevity-linked securities are not widely traded in financial markets, due in particular to the difficulty of pricing them. In this paper, we compare different existing pricing methods and propose a Cost of Capital approach. Our method is designed to be more consistent with Solvency II requirements (longevity risk assessment is based on a one-year time horizon). The price of longevity risk is determined for an S-forward and an S-swap but can be used to price other longevity-linked securities. We also compare this Cost of Capital method with some classical pricing approaches. The Hull–White and extended CIR models are used to represent the evolution of mortality over time. We use data for the Belgian population to derive prices for the proposed longevity-linked securities under the different methods.
    Electronic ISSN: 2227-9091
    Topics: Economics
    Published by MDPI
  • 89
    Publication Date: 2019
    Description: A recently introduced accounting standard, namely the International Financial Reporting Standard 9, requires banks to build provisions based on forward-looking expected loss models. When there is a significant increase in credit risk of a loan, additional provisions must be charged to the income statement. Banks need to set for each loan a threshold defining what such a significant increase in credit risk constitutes. A low threshold allows banks to recognize credit risk early, but leads to income volatility. We introduce a statistical framework to model this trade-off between early recognition of credit risk and avoidance of excessive income volatility. We analyze the resulting optimization problem for different models, relate it to the banking stress test of the European Union, and illustrate it using default data by Standard and Poor’s.
    Electronic ISSN: 2227-9091
    Topics: Economics
    Published by MDPI
  • 90
    Publication Date: 2019
    Description: As decarbonisation progresses and conventional thermal generation gradually gives way to other technologies including intermittent renewables, there is an increasing requirement for system balancing from new and also fast-acting sources such as battery storage. In the deregulated context, this raises questions of market design and operational optimisation. In this paper, we assess the real option value of an arrangement under which an autonomous energy-limited storage unit sells incremental balancing reserve. The arrangement is akin to a perpetual American swing put option with random refraction times, where a single incremental balancing reserve action is sold at each exercise. The power used is bought in an energy imbalance market (EIM), whose price we take as a general regular one-dimensional diffusion. The storage operator’s strategy and its real option value are derived in this framework by solving the twin timing problems of when to buy power and when to sell reserve. Our results are illustrated with an operational and economic analysis using data from the German Amprion EIM.
    Electronic ISSN: 2227-9091
    Topics: Economics
    Published by MDPI
  • 91
    Publication Date: 2019
    Description: We investigate a shot noise process with subexponential shot marks occurring at renewal epochs. Our main result is a precise asymptotic formula for its tail probability. In doing so, some recent results regarding sums of randomly weighted subexponential random variables play a crucial role.
    Electronic ISSN: 2227-9091
    Topics: Economics
    Published by MDPI
  • 92
    Publication Date: 2019
    Description: This paper discusses ambiguity in the context of single-name credit risk. We focus on uncertainty in the default intensity, but also discuss uncertainty in the recovery under a fractional recovery of the market value. This approach is a first step towards integrating uncertainty into credit-risky term structure models and profits from its simplicity. We derive drift conditions in a Heath–Jarrow–Morton forward rate setting in the case of an ambiguous default intensity combined with zero recovery, and in the case of an ambiguous fractional recovery of the market value.
    Electronic ISSN: 2227-9091
    Topics: Economics
    Published by MDPI
  • 93
    Publication Date: 2019
    Description: In this paper, we study a generalised CIR process with externally-exciting and self-exciting jumps, and focus on the distributional properties and applications of this process and its aggregated process. The aim of the paper is to introduce a more general process that encompasses many models in the literature with self-exciting and externally-exciting jumps. The first and second moments of this jump-diffusion process are used to calculate the insurance premium based on the mean-variance principle. The Laplace transform of the aggregated process is derived, which leads to an application for pricing default-free bonds that captures the impacts of both exogenous and endogenous shocks. Illustrative numerical examples and comparisons with other models are also provided.
    Electronic ISSN: 2227-9091
    Topics: Economics
    Published by MDPI
  • 94
    Publication Date: 2019
    Description: We present an approach to individual claims reserving and claim watching in general insurance based on classification and regression trees (CART). We propose a compound model consisting of a frequency section, for the prediction of events concerning reported claims, and a severity section, for the prediction of paid and reserved amounts. The formal structure of the model is based on a set of probabilistic assumptions which allow the provision of sound statistical meaning to the results provided by the CART algorithms. The multiperiod predictions required for claims reserving estimations are obtained by compounding one-period predictions through a simulation procedure. The resulting dynamic model allows the joint modeling of the case reserves, which usually yields useful predictive information. The model also allows predictions under a double-claim regime, i.e., when two different types of compensation can be required by the same claim. Several explicit numerical examples are provided using motor insurance data. For a large claims portfolio we derive an aggregate reserve estimate obtained as the sum of individual reserve estimates and we compare the result with the classical chain-ladder estimate. Backtesting exercises are also proposed concerning event predictions and claim-reserve estimates.
    Electronic ISSN: 2227-9091
    Topics: Economics
    Published by MDPI
  • 95
    Publication Date: 2019
    Description: Rigorous peer-review is the corner-stone of high-quality academic publishing. [...]
    Electronic ISSN: 2227-9091
    Topics: Economics
    Published by MDPI
  • 96
    Publication Date: 2019
    Description: We propose a statistical measure, based on correlation networks, to evaluate the systemic risk that could arise from the resolution of a failing or likely-to-fail financial institution, under three alternative scenarios: liquidation, private recapitalization, or bail-in. The measure enhances the observed CDS spreads with a risk premium that derives from contagion effects across financial institutions. The empirical findings reveal that the recapitalization of a distressed bank performed by the other banks in the system and the bail-in resolution minimize the potential losses for the banking sector with respect to the liquidation scenario, thus posing limited systemic risks. A closer comparison between the private intervention recapitalization and the bail-in tool shows that the latter slightly reduces contagion effects with respect to the private intervention scenario.
    Electronic ISSN: 2227-9091
    Topics: Economics
    Published by MDPI
  • 97
    Publication Date: 2019
    Description: The aim of this project is to analyze high-frequency (second-by-second) GPS location data of individual car drivers and trips. We extract feature information about speeds, acceleration, deceleration, and changes of direction from these data. Time series of this feature information allow us to appropriately allocate individual car driving trips to selected drivers using convolutional neural networks.
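A minimal sketch of the kind of speed/acceleration feature extraction described, assuming 1 Hz fixes (function names and the haversine step are illustrative, not taken from the project):

```python
import numpy as np

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between consecutive GPS fixes."""
    r = 6_371_000.0  # mean Earth radius in metres
    p1, p2 = np.radians(lat1), np.radians(lat2)
    dphi, dlam = p2 - p1, np.radians(lon2 - lon1)
    a = np.sin(dphi / 2) ** 2 + np.cos(p1) * np.cos(p2) * np.sin(dlam / 2) ** 2
    return 2.0 * r * np.arcsin(np.sqrt(a))

def speed_accel(lat, lon):
    """Speed (m/s) and acceleration (m/s^2) series from 1 Hz GPS fixes."""
    v = haversine_m(lat[:-1], lon[:-1], lat[1:], lon[1:])  # dt = 1 s
    return v, np.diff(v)

# Toy trip: driving due east along the equator at a constant ~11 m/s
lat = np.zeros(6)
lon = np.arange(6) * 1e-4
v, a = speed_accel(lat, lon)
```

Such per-second series (speed, acceleration, change of direction) are the kind of channels a convolutional network would consume.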
    Electronic ISSN: 2227-9091
    Topics: Economics
    Published by MDPI
  • 98
    Publication Date: 2019
    Description: We use the theory of coherent measures to look at the problem of surplus sharing in an insurance business. The surplus share of an insured is calculated by the surplus premium in the contract. The theory of coherent risk measures and the resulting capital allocation gives a way to divide the surplus between the insured and the capital providers, i.e., the shareholders.
    Electronic ISSN: 2227-9091
    Topics: Economics
    Published by MDPI
  • 99
    Publication Date: 2019
    Description: Quantiles of probability distributions play a central role in the definition of risk measures (e.g., value-at-risk, conditional tail expectation) which in turn are used to capture the riskiness of the distribution tail. Estimates of risk measures are needed in many practical situations such as in pricing of extreme events, developing reserve estimates, designing risk transfer strategies, and allocating capital. In this paper, we present the empirical nonparametric and two types of parametric estimators of quantiles at various levels. For parametric estimation, we employ the maximum likelihood and percentile-matching approaches. Asymptotic distributions of all the estimators under consideration are derived when data are left-truncated and right-censored, which is a typical loss variable modification in insurance. Then, we construct relative efficiency curves (REC) for all the parametric estimators. Specific examples of such curves are provided for exponential and single-parameter Pareto distributions for a few data truncation and censoring cases. Additionally, using simulated data we examine how wrong quantile estimates can be when one makes incorrect modeling assumptions. The numerical analysis is also supplemented with standard model diagnostics and validation (e.g., quantile-quantile plots, goodness-of-fit tests, information criteria) and presents an example of when those methods can mislead the decision maker. These findings pave the way for further work on RECs with potential for them being developed into an effective diagnostic tool in this context.
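For the exponential example mentioned in this abstract, the contrast between the nonparametric and parametric estimators is easy to state on complete data: the MLE of the exponential mean is the sample mean θ̂, giving the parametric p-quantile estimate −θ̂ ln(1 − p). A minimal sketch on untruncated, uncensored data (the paper itself treats the left-truncated, right-censored case):

```python
import numpy as np

def empirical_quantile(sample, p):
    """Nonparametric estimator: the empirical p-quantile."""
    return float(np.quantile(sample, p))

def exponential_mle_quantile(sample, p):
    """Parametric estimator for Exp(theta): theta_hat is the sample
    mean, and the estimated p-quantile is -theta_hat * log(1 - p)."""
    return float(-np.mean(sample) * np.log(1.0 - p))

rng = np.random.default_rng(7)
data = rng.exponential(scale=10.0, size=50_000)
q_emp = empirical_quantile(data, 0.99)   # nonparametric estimate
q_mle = exponential_mle_quantile(data, 0.99)  # parametric estimate
# True 99% quantile of Exp(scale=10) is -10 * ln(0.01), about 46.05
```

When the parametric family is correct, the MLE-based estimate is typically more efficient than the empirical one; the paper's relative efficiency curves quantify exactly this trade-off.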
    Electronic ISSN: 2227-9091
    Topics: Economics
    Published by MDPI
  • 100
    Publication Date: 2019
    Description: In the presence of recovery risk, the recovery rate is a random variable whose risk-neutral expectation can be inferred from the prices of defaultable instruments. I extract market-implied recovery rates from the term structures of credit default swap spreads for a sample of 497 United States (U.S.) corporate issuers over the 2005–2014 period. I analyze the explanatory factors of market-implied recovery rates within a linear regression framework and also within a Tobit model, and I compare them with the determinants of historical recovery rates that were previously identified in the literature. In contrast to their historical counterparts, market-implied recovery rates are mostly driven by macroeconomic factors and long-term, issuer-specific variables. Short-term financial variables and industry conditions significantly impact the slope of market-implied recovery rates. These results indicate that the design of a recovery risk model should be based on specific market factors, not on the statistical evidence that is provided by historical recovery rates.
    Electronic ISSN: 2227-9091
    Topics: Economics
    Published by MDPI