ALBERT

All Library Books, journals and Electronic Records Telegrafenberg


Filter
Collection
  • Articles  (125)
Publisher
  • Molecular Diversity Preservation International  (125)
  • Institute of Electrical and Electronics Engineers (IEEE)
  • MDPI  (115)
Years
  • 2020-2022
  • 2015-2019  (125)
  • 2010-2014
  • 1990-1994
  • 1945-1949
Year
  • 2019  (125)
  • 2010
  • 1950  (54)
Topic
  • Economics  (125)
  • Computer Science
  • 1
    Publication Date: 2019-12-13
    Description: This paper proposes a new method to model loss given default (LGD) for IFRS 9 purposes. We develop two models for the purposes of this paper: LGD1 and LGD2. The LGD1 model is applied to the non-default (performing) accounts, and its empirical value is based on a specified reference period, using a lookup table. We also segment this across the most important variables to obtain a more granular estimate. The LGD2 model is applied to defaulted accounts, and we estimate the model by means of an exposure-weighted logistic regression. This newly developed LGD model is tested on a secured retail portfolio from a bank. We compare this weighted logistic regression (WLR) (under the assumption of independence) with generalised estimating equations (GEEs) to test the effects of disregarding the dependence among the repeated observations per account. When disregarding this dependence in the application of WLR, the standard errors of the parameter estimates are underestimated. However, the practical effect of this implementation in terms of model accuracy is found to be negligible. The main advantage of the newly developed methodology is the simplicity of this well-known approach, namely logistic regression of binned variables, resulting in a scorecard format.
    Electronic ISSN: 2227-9091
    Topics: Economics
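The exposure-weighted logistic regression at the core of the LGD2 model can be sketched in a few lines. The sketch below is a minimal, pure-Python illustration, not the paper's implementation: a single binned risk driver, with each observation's score contribution weighted by a hypothetical exposure value, fitted by gradient ascent on the weighted log-likelihood. All data and names are made up.

```python
import math

def fit_weighted_logistic(xs, ys, weights, lr=0.5, epochs=3000):
    """Exposure-weighted logistic regression with a single predictor,
    fitted by gradient ascent on the weighted log-likelihood."""
    w, b = 0.0, 0.0
    total = sum(weights)
    for _ in range(epochs):
        gw = gb = 0.0
        for x, y, e in zip(xs, ys, weights):
            p = 1.0 / (1.0 + math.exp(-(w * x + b)))
            gw += e * (y - p) * x  # exposure-weighted score for the slope
            gb += e * (y - p)      # ... and for the intercept
        w += lr * gw / total
        b += lr * gb / total
    return w, b

# Hypothetical data: loss indicator vs. a binned risk driver,
# each observation weighted by an illustrative exposure at default.
xs = [0.2, 0.4, 0.6, 0.8, 1.0, 1.2]
ys = [0, 0, 0, 1, 1, 1]
exposure = [1.0, 2.0, 1.5, 3.0, 1.0, 2.5]
w, b = fit_weighted_logistic(xs, ys, exposure)
```

In practice, the GEE comparison in the abstract amounts to replacing the independence assumption behind these weighted score equations with a working correlation structure across the repeated observations per account.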
  • 2
    Publication Date: 2019-12-12
    Description: The aim of this work is to assess the systemic risk of Tunisian listed banks. The goal is to identify the institutions that contribute the most to systemic risk and those that are most exposed to it. We use CoVaR, which considers the systemic risk as the value at risk (VaR) of a financial institution conditioned on the VaR of another institution. Thus, if the CoVaR increases with respect to the VaR, the spillover risk among the institutions also increases. The difference between these measurements is termed ΔCoVaR, and it allows for estimating the exposure and contribution of each bank to systemic risk. The results allow Tunisian banks to be classified in terms of systemic risk involvement. They show that public banks occupy the top places, followed by the two largest private banks in Tunisia. These five banks are the main systemic players in the Tunisian banking sector. It seems that they are the least sensitive to the financial difficulties of existing banks and the most important contributors to the distress of the other banks. This work aims to add a broader perspective to the microprudential application of regulation, including contagion, proposing a macroprudential vision and a strengthening of regulatory policy. Supervisors could impose close supervision on institutions considered to be potentially systemic banks. Furthermore, regulations should consider the systemic contribution when defining risk requirements to minimize the consequences of possible herd behavior.
    Electronic ISSN: 2227-9091
    Topics: Economics
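The CoVaR/ΔCoVaR construction described above can be illustrated with empirical quantiles. The sketch below is a simplified, pure-Python illustration on simulated correlated returns, not the authors' estimation procedure (which typically relies on quantile regression): it compares the system's return quantile conditional on the bank sitting near its VaR with the same quantile conditional on the bank sitting near its median. All data are made up.

```python
import random

def empirical_quantile(xs, q):
    """Empirical quantile with linear interpolation between order statistics."""
    s = sorted(xs)
    pos = q * (len(s) - 1)
    lo = int(pos)
    hi = min(lo + 1, len(s) - 1)
    return s[lo] + (pos - lo) * (s[hi] - s[lo])

def delta_covar(bank, system, q=0.05, band=0.02):
    """DeltaCoVaR sketch: q-quantile of system returns when the bank sits
    near its q-VaR, minus the same quantile when it sits near its median."""
    def covar(level):
        lo = empirical_quantile(bank, max(level - band, 0.0))
        hi = empirical_quantile(bank, min(level + band, 1.0))
        conditioned = [s for b, s in zip(bank, system) if lo <= b <= hi]
        return empirical_quantile(conditioned, q)
    return covar(q) - covar(0.5)

# Hypothetical correlated daily returns for one bank and the system index.
rng = random.Random(42)
common = [rng.gauss(0, 0.01) for _ in range(5000)]
bank = [c + rng.gauss(0, 0.004) for c in common]
system = [c + rng.gauss(0, 0.004) for c in common]
dcv = delta_covar(bank, system)
```

Because the bank and the system load on a common factor, the conditional quantile in distress is lower than at the median, so the ΔCoVaR estimate is negative (a loss contribution).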
  • 3
    Publication Date: 2019-12-11
    Description: This paper considers the Brownian perturbed Cramér–Lundberg risk model with a dividends barrier. We study various types of Padé approximations and Laguerre expansions to compute or approximate the scale function that is necessary to optimize the dividends barrier. We experiment also with a heavy-tailed claim distribution for which we apply the so-called “shifted” Padé approximation.
    Electronic ISSN: 2227-9091
    Topics: Economics
  • 4
    Publication Date: 2019-12-13
    Description: We develop valuation and risk techniques for the future benefits of a retiree who participates in the American Social Security program based on their chosen date of retirement, the term structure of interest rates, and forecasted life expectancy. These valuation methods are then used to determine the optimal retirement time of a beneficiary given a specific wage history and health profile in the sense of maximizing the present value of cash flows received during retirement years. We then examine how a number of risk factors including interest rates, disease diagnosis, and mortality risks impact benefit value. Specifically, we utilize principal component analysis in order to assess both interest rate and mortality risk. We then conduct numerical studies to examine how such risks range over distinct income and demographic groups and finally summarize future research directions.
    Electronic ISSN: 2227-9091
    Topics: Economics
  • 5
    Publication Date: 2019-12-10
    Description: In this paper, we study a stochastic control problem faced by an insurance company allowed to pay out dividends and make capital injections. As in Løkka and Zervos (2008) and Lindensjö and Lindskog (2019) for a Brownian motion risk process, and in Zhu and Yang (2016) for diffusion processes, we show that the so-called Løkka–Zervos alternative also holds true in the case of a Cramér–Lundberg risk process with exponential claims. More specifically, we show that: if the cost of capital injections is low, then according to a double-barrier strategy, it is optimal to pay dividends and inject capital, meaning ruin never occurs; and if the cost of capital injections is high, then according to a single-barrier strategy, it is optimal to pay dividends and never inject capital, meaning ruin occurs at the first passage below zero.
    Electronic ISSN: 2227-9091
    Topics: Economics
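For background, the uncontrolled Cramér–Lundberg surplus process with exponential claims that underlies this control problem can be simulated directly. The sketch below is a minimal Monte Carlo illustration with made-up parameter values and no dividends or capital injections; it estimates a finite-horizon ruin probability.

```python
import random

def ruin_probability(u, c, lam, claim_mean, horizon=100.0,
                     n_paths=4000, seed=7):
    """Monte Carlo estimate of the finite-horizon ruin probability for a
    Cramer-Lundberg surplus process: surplus starts at u, grows at premium
    rate c, and drops by exponential claims arriving at Poisson rate lam."""
    rng = random.Random(seed)
    ruined = 0
    for _ in range(n_paths):
        t, x = 0.0, u
        while True:
            dt = rng.expovariate(lam)                # time to next claim
            if t + dt > horizon:
                break                                # no ruin before horizon
            t += dt
            x += c * dt                              # premiums collected meanwhile
            x -= rng.expovariate(1.0 / claim_mean)   # exponential claim size
            if x < 0:
                ruined += 1
                break
    return ruined / n_paths

# Positive safety loading: c = 1.5 > lam * claim_mean = 1.0
p = ruin_probability(u=10.0, c=1.5, lam=1.0, claim_mean=1.0)
```

With a positive safety loading and a comfortable initial surplus, the estimated ruin probability is small; shrinking the initial surplus toward zero drives it up sharply.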
  • 6
    Publication Date: 2019-12-10
    Description: A new Bornhuetter–Ferguson method is suggested herein. This is a variant of the traditional chain ladder method. The actuary can adjust the relative ultimates using externally estimated relative ultimates. These correspond to linear constraints on the Poisson likelihood underpinning the chain ladder method. Adjusted cash flow estimates were obtained as constrained maximum likelihood estimates. The statistical derivation of the new method is provided in the generalised linear model framework. A related approach in the literature, combining unconstrained and constrained maximum likelihood estimates, is presented in the same framework and compared theoretically. A data illustration is described using a motor portfolio from a Greek insurer.
    Electronic ISSN: 2227-9091
    Topics: Economics
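Since the proposed method is a constrained variant of the chain ladder, the unconstrained baseline is worth recalling. The sketch below is a minimal pure-Python chain ladder on a toy cumulative run-off triangle (illustrative numbers, not the Greek motor portfolio): volume-weighted development factors, then projection of each accident year to ultimate.

```python
def chain_ladder_factors(triangle):
    """Volume-weighted chain-ladder development factors from a run-off
    triangle given as rows of cumulative claims (shorter rows = recent years)."""
    n = len(triangle[0])
    factors = []
    for j in range(n - 1):
        num = sum(row[j + 1] for row in triangle if len(row) > j + 1)
        den = sum(row[j] for row in triangle if len(row) > j + 1)
        factors.append(num / den)
    return factors

def complete_triangle(triangle):
    """Project each accident year to ultimate by applying the remaining
    development factors to its latest cumulative value."""
    f = chain_ladder_factors(triangle)
    ultimates = []
    for row in triangle:
        x = row[-1]
        for j in range(len(row) - 1, len(f)):
            x *= f[j]
        ultimates.append(x)
    return ultimates

# Toy cumulative run-off triangle: three accident years (illustrative numbers).
triangle = [[100.0, 150.0, 165.0],
            [110.0, 160.0],
            [120.0]]
ultimates = complete_triangle(triangle)
```

The Bornhuetter–Ferguson variant in the abstract then adjusts these ultimates toward externally estimated relative ultimates, expressed as linear constraints on the underlying Poisson likelihood.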
  • 7
    Publication Date: 2019-11-19
    Description: The Segerdahl-Tichy process, characterized by exponential claims and state-dependent drift, has drawn a considerable amount of interest due to its economic relevance (it is the simplest risk process which takes into account the effect of interest rates). It is also the simplest non-Lévy, non-diffusion example of a spectrally negative Markov risk model. Note that for both spectrally negative Lévy and diffusion processes, first passage theories have been developed which are based on identifying two “basic” monotone harmonic functions/martingales. This means that for these processes many control problems involving dividends, capital injections, etc., may be solved explicitly once the two basic functions have been obtained. Furthermore, extensions to general spectrally negative Markov processes are possible; unfortunately, methods for computing the basic functions are still lacking outside the Lévy and diffusion classes. This divergence between theory and computation is strikingly illustrated by the Segerdahl process, for which there exist today six theoretical approaches, but for which almost nothing has been computed, with the exception of the ruin probability. Below, we review four of these methods, with the purpose of drawing attention to connections between them, underlining open problems, and stimulating further work.
    Electronic ISSN: 2227-9091
    Topics: Economics
  • 8
    Publication Date: 2019-11-04
    Description: The purpose of this paper is to evaluate and estimate market risk for the ten major industries in Vietnam. The focus of the empirical analysis is on the energy sector, which has been designated as one of the four key industries, together with services, food, and telecommunications, targeted for economic development by the Vietnam Government through to 2020. The oil and gas industry is a separate energy-related major industry, and it is evaluated separately from energy. The data set is from 2009 to 2017, which is decomposed into two distinct sub-periods after the Global Financial Crisis (GFC), namely the immediate post-GFC (2009–2011) period and the normal (2012–2017) period, in order to identify the behavior of market risk for Vietnam’s major industries. For the stock market in Vietnam, the website used in this paper provided complete and detailed data for each stock, as classified by industry. Two widely used approaches to measure and analyze risk are used in the empirical analysis, namely Value-at-Risk (VaR) and Conditional Value-at-Risk (CVaR). The empirical findings indicate that Energy and Pharmaceuticals are the least risky industries, whereas oil and gas and securities have the greatest risk. In general, there is strong empirical evidence that the four key industries display relatively low risk. For public policy, the Vietnam Government’s proactive emphasis on the targeted industries, including energy, to achieve sustainable economic growth and national economic development, seems to be working effectively. This paper presents striking empirical evidence that Vietnam’s industries have substantially improved their economic performance over the full sample, moving from relatively higher levels of market risk in the immediate post-GFC period to a lower risk environment in a normal period several years after the end of the calamitous GFC.
    Electronic ISSN: 2227-9091
    Topics: Economics
  • 9
    Publication Date: 2019-11-01
    Description: In this paper, we solve the problem of modelling mid-price movements arising in high-frequency and algorithmic trading, using real data. Namely, we introduce several new types of General Compound Hawkes Processes (GCHPDO, GCHP2SDO, GCHPnSDO) and find their diffusive limits to model the mid-price movements of six stocks: EBAY, FB, MU, PCAR, SMH, and CSCO. We also define error rates to estimate the models' fitting accuracy. Maximum Likelihood Estimation (MLE) and Particle Swarm Optimization (PSO) are used to calibrate the Hawkes process and model parameters.
    Electronic ISSN: 2227-9091
    Topics: Economics
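A univariate Hawkes process with an exponentially decaying kernel, the building block of the general compound Hawkes processes mentioned above, can be simulated by Ogata's thinning algorithm. The sketch below is a minimal pure-Python illustration with made-up parameters; it is not the authors' model of the six stocks.

```python
import math
import random

def simulate_hawkes(mu, alpha, beta, horizon, seed=0):
    """Simulate a univariate Hawkes process with exponentially decaying
    kernel, lambda(t) = mu + sum_i alpha * exp(-beta * (t - t_i)),
    using Ogata's thinning algorithm."""
    rng = random.Random(seed)
    t, events = 0.0, []
    while True:
        # The intensity decays between events, so its current value is a
        # valid upper bound until the next candidate point.
        lam_bar = mu + sum(alpha * math.exp(-beta * (t - ti)) for ti in events)
        t += rng.expovariate(lam_bar)
        if t > horizon:
            return events
        lam_t = mu + sum(alpha * math.exp(-beta * (t - ti)) for ti in events)
        if rng.random() * lam_bar <= lam_t:   # accept with prob lam_t/lam_bar
            events.append(t)

# Stationarity requires alpha/beta < 1 (here 0.8/1.2); parameters are made up.
times = simulate_hawkes(mu=0.5, alpha=0.8, beta=1.2, horizon=50.0, seed=3)
```

Compounding such event times with mark distributions for the tick moves is what yields the mid-price models whose diffusive limits the paper studies.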
  • 10
    Publication Date: 2019-10-28
    Description: This paper investigates option valuation when the underlying market suffers from illiquidity in the form of price impact. Using option data, I infer trading activities and price impacts on the buy side and the sell side of the stock market from option prices across maturities. The findings show that the stock market is active when stock prices plummet, but becomes silent after the market crashes. In addition, the difference in option-implied price impacts between the buy side and the sell side, which indicates asymmetric liquidity, increases with the time to maturity, especially on the day of the market crisis. Moreover, according to the option-implied price impact (or market depth) curves, investors have different perspectives on future liquidity after liquidity shocks depending on whether they are in a bull market or a bear market. I also calibrate three market indices simultaneously and reach the same conclusion: the three markets become erratic on the event date and calm down in the aftermath.
    Electronic ISSN: 2227-9091
    Topics: Economics
  • 11
    Publication Date: 2019-10-01
    Description: Global investors' investment in local currency bonds, especially Korea Treasury Bonds, has increased significantly since the mid-2000s, and their influence on bond and financial markets has grown consistently. In this paper, we investigate global investors' prioritization of decision factors when investing in Korea Treasury Bonds by distributing a pairwise comparative survey to experts and analyzing the results using the analytic hierarchy process technique. For the analysis, we created model frames with experts in the field of investment based on a literature analysis, selected survey participants by considering their institution of employment, work experience and region, and obtained responses. We find that investors with short-term investment propensities are more sensitive to international and domestic factors, less so to risk factors, and more heavily influenced by U.S. dollar funding conditions. On the other hand, investors with long-term investment tendencies are found to be more sensitive to international and risk factors as opposed to domestic factors, and influenced by global policy rate decisions, fiscal soundness, sovereign credit rating, a possible global economic recession, and geographical risks. Our findings not only contribute to enhancing investors' understanding of the Korean bond market by discussing consensus among investors, but also provide policy implications for Korean government policymakers who need stable and sustained funding.
    Electronic ISSN: 2227-9091
    Topics: Economics
  • 12
    Publication Date: 2019-09-29
    Description: Since the 2008–2009 financial crisis, banks have introduced a family of X-valuation adjustments (XVAs) to quantify the cost of counterparty risk and of its capital and funding implications. XVAs represent a paradigm switch in derivative management, from hedging to balance sheet optimization. They reflect market inefficiencies that should be compressed as much as possible. In this work, we present a genetic algorithm applied to the compression of credit valuation adjustment (CVA), the expected cost of client defaults to a bank. The design of the algorithm is fine-tuned to the hybrid structure, with both discrete and continuous parameters, of the corresponding high-dimensional and nonconvex optimization problem. To make intensive trade-incremental XVA computations practical in real time, as required for XVA compression purposes, we propose an approach that circumvents portfolio revaluation at the cost of disk memory, storing the portfolio exposure overnight so that the exposure of the portfolio augmented by a new deal can be obtained at the cost of computing the exposure of the new deal only. This is illustrated by a CVA compression case study on real swap portfolios.
    Electronic ISSN: 2227-9091
    Topics: Economics
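The store-and-augment trick for trade-incremental exposure computations can be sketched directly, assuming exposures that aggregate additively across deals (e.g., pathwise mark-to-market within a netting set, before collateral or other nonlinearities). Function names and numbers below are illustrative, not from the paper.

```python
def portfolio_exposure(deal_paths):
    """Pathwise portfolio exposure as the elementwise sum of per-deal paths
    (valid only where exposures aggregate additively)."""
    return [sum(step) for step in zip(*deal_paths)]

# Overnight batch: price every deal once and store the aggregate path.
stored_portfolio = portfolio_exposure([[1.0, 2.0, 1.5],
                                       [0.5, 0.5, 0.5]])

def incremental_exposure(stored_path, new_deal_path):
    """Exposure of the portfolio augmented by one new deal: reuse the stored
    path and price only the new deal, avoiding full portfolio revaluation."""
    return [p + d for p, d in zip(stored_path, new_deal_path)]

# Intraday: a candidate new deal only requires its own exposure path.
augmented = incremental_exposure(stored_portfolio, [0.25, -0.5, 0.0])
```

The design trade-off is exactly the one the abstract names: disk memory for the stored paths in exchange for incremental pricing cost proportional to the new deal alone.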
  • 13
    Publication Date: 2019-09-17
    Description: This paper considers fundamental questions of arbitrage pricing that arise when the uncertainty model incorporates ambiguity about risk. This additional ambiguity motivates a new principle of risk- and ambiguity-neutral valuation as an extension of the paper by Ross (1976) (Ross, Stephen A. 1976. The arbitrage theory of capital asset pricing. Journal of Economic Theory 13: 341–60). In the spirit of Harrison and Kreps (1979) (Harrison, J. Michael, and David M. Kreps. 1979. Martingales and arbitrage in multiperiod securities markets. Journal of Economic Theory 20: 381–408), the paper establishes a micro-economic foundation of viability in which ambiguity-neutrality imposes a fair-pricing principle via symmetric multiple prior martingales. The resulting equivalent symmetric martingale measure set exists if the uncertain volatility in asset prices is driven by an ambiguous Brownian motion.
    Electronic ISSN: 2227-9091
    Topics: Economics
  • 14
    Publication Date: 2019-10-12
    Description: We present an approach to individual claims reserving and claim watching in general insurance based on classification and regression trees (CART). We propose a compound model consisting of a frequency section, for the prediction of events concerning reported claims, and a severity section, for the prediction of paid and reserved amounts. The formal structure of the model is based on a set of probabilistic assumptions which allow the provision of sound statistical meaning to the results provided by the CART algorithms. The multiperiod predictions required for claims reserving estimations are obtained by compounding one-period predictions through a simulation procedure. The resulting dynamic model allows the joint modeling of the case reserves, which usually yields useful predictive information. The model also allows predictions under a double-claim regime, i.e., when two different types of compensation can be required by the same claim. Several explicit numerical examples are provided using motor insurance data. For a large claims portfolio we derive an aggregate reserve estimate obtained as the sum of individual reserve estimates and we compare the result with the classical chain-ladder estimate. Backtesting exercises are also proposed concerning event predictions and claim-reserve estimates.
    Electronic ISSN: 2227-9091
    Topics: Economics
  • 15
    Publication Date: 2019-09-20
    Description: In this paper, a new heavy-tailed distribution, the mixture Pareto-loggamma distribution, derived through an exponential transformation of the generalized Lindley distribution, is introduced. The resulting model is expressed as a convex sum of the classical Pareto and a special case of the loggamma distribution. A comprehensive exploration of its statistical properties and theoretical results related to insurance are provided. Estimation is performed by using the method of log-moments and maximum likelihood. Also, as the modal value of this distribution is expressed in closed form, composite parametric models are easily obtained by a mode-matching procedure. The performance of both the mixture Pareto-loggamma distribution and the composite models is tested by employing different claims datasets.
    Electronic ISSN: 2227-9091
    Topics: Economics
  • 16
    Publication Date: 2019-08-27
    Description: I derive practical formulas for optimal arrangements between sophisticated stock market investors (continuous-time Kelly gamblers or, more generally, CRRA investors) and the brokers who lend them cash for leveraged bets on a high Sharpe asset (i.e., the market portfolio). Rather than, say, the broker posting a monopoly price for margin loans, the gambler agrees to use a greater quantity of margin debt than he otherwise would in exchange for an interest rate that is lower than the broker would otherwise post. The gambler thereby attains a higher asymptotic capital growth rate and the broker enjoys a greater rate of intermediation profit than would be obtained under non-cooperation. If the threat point represents a complete breakdown of negotiations (resulting in zero margin loans), then we get an elegant rule of thumb: r_L* = (3/4)r + (1/4)ν − σ^2/2, where r is the broker’s cost of funds, ν is the compound-annual growth rate of the market index, and σ is the annual volatility. We show that, regardless of the particular threat point, the gambler will negotiate to size his bets as if he himself could borrow at the broker’s call rate.
    Electronic ISSN: 2227-9091
    Topics: Economics
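The rule of thumb quoted in the abstract is a one-liner. The function below simply evaluates r_L* = (3/4)r + (1/4)ν − σ^2/2; the input values are made up for illustration, not from the paper.

```python
def negotiated_margin_rate(r, nu, sigma):
    """Rule-of-thumb negotiated margin loan rate from the abstract:
    r_L* = (3/4) r + (1/4) nu - sigma**2 / 2."""
    return 0.75 * r + 0.25 * nu - sigma ** 2 / 2

# Illustrative inputs: broker cost of funds 3%, index growth 9%, volatility 18%.
rate = negotiated_margin_rate(r=0.03, nu=0.09, sigma=0.18)  # about 0.0288, i.e. 2.88%
```

Note the σ^2/2 term: the negotiated rate is discounted by half the variance, the same volatility drag that separates compound growth from arithmetic return in continuous-time Kelly betting.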
  • 17
    Publication Date: 2019-08-26
    Description: Credit default swap (CDS) spreads measure the default risk of the reference entity and have been frequently used in recent empirical papers. To provide a rigorous econometrics foundation for empirical CDS analysis, this paper applies the augmented Dickey–Fuller, Phillips–Perron, Kwiatkowski–Phillips–Schmidt–Shin, and Ng–Perron tests to study the unit root property of CDS spreads, and it uses the Phillips–Ouliaris–Hansen tests to determine whether they are cointegrated. The empirical sample consists of daily CDS spreads of the six large U.S. banks from 2001 to 2018. The main findings are that it is log, not raw, CDS spreads that are unit root processes, and that log CDS spreads are cointegrated. These findings imply that, even though the risks of individual banks may deviate from each other in the short run, there is a long-run relation that ties them together. As these CDS spreads are an important input for financial systemic risk, there are at least two policy implications. First, in monitoring systemic risk, policymakers should focus on long-run trends rather than short-run fluctuations of CDS spreads. Second, in controlling systemic risk, policy measures that reduce the long-run risks of individual banks, such as stress testing and capital buffers, are helpful in mitigating overall systemic risk.
    Electronic ISSN: 2227-9091
    Topics: Economics
  • 18
    Publication Date: 2019-08-26
    Description: This paper provides a critical analysis of the subadditivity axiom, which is the key condition for coherent risk measures. Contrary to the subadditivity assumption, bank mergers can create extra risk. We begin with an analysis of how a merger affects depositors, junior or senior bank creditors, and bank owners. Next, it is shown that bank mergers can result in higher payouts having to be made by the deposit insurance scheme. Finally, we demonstrate that if banks are interconnected via interbank loans, a bank merger could lead to additional contagion risks. We conclude that the subadditivity assumption should be rejected, since a subadditive risk measure, by definition, cannot account for such increased risks.
    Electronic ISSN: 2227-9091
    Topics: Economics
  • 19
    Publication Date: 2019-08-25
    Description: This paper examines bank liquidity risk using a maturity mismatch indicator of loans and deposits (LTDm) during a specific period. Core banking activities that are based on the process of maturity transformation are the most exposed to liquidity risk. The financial crisis of 2007–2009 highlighted the importance of liquidity to the functioning of both the financial markets and the banking sector. We investigate how characteristics of a bank, such as size, capital, and business model, are related to liquidity risk, using a sample of European banks in the period after the financial crisis, from 2011 to 2017. Employing a two-step generalized method of moments estimator, we find that bank size increases liquidity risk, whereas capital is not an effective deterrent. Moreover, our findings reveal that, for savings banks, income diversification raises liquidity risk, while investment banks reliant on non-deposit funding decrease their exposure to liquidity risk.
    Electronic ISSN: 2227-9091
    Topics: Economics
  • 20
    Publication Date: 2019-08-15
    Description: With the expected discontinuation of LIBOR publication, a robust fallback for related financial instruments is paramount. In recent months, several consultations have taken place on the subject. The results of the first ISDA consultation were published in November 2018, and a new one had just finished at the time of writing. This note describes issues associated with the proposed approaches, and potential alternative approaches, in the framework and context of quantitative finance. It evidences a clear lack of detail and measurability in the proposed approaches, which would not be achievable in practice. It also describes the potential for asymmetric information between market participants arising from the adjustment spread computation. In the opinion of this author, a fundamental revision of the fallback's foundations is required.
    Electronic ISSN: 2227-9091
    Topics: Economics
  • 21
    Publication Date: 2019-07-07
    Description: We propose an alternative approach to the modeling of the positive dependence between the probability of default and the loss given default in a portfolio of exposures, using a bivariate urn process. The model combines the power of Bayesian nonparametrics and statistical learning, allowing for the elicitation and the exploitation of experts’ judgements, and for the constant update of this information over time, every time new data are available. A real-world application on mortgages is described using the Single Family Loan-Level Dataset by Freddie Mac.
    Electronic ISSN: 2227-9091
    Topics: Economics
  • 22
    Publication Date: 2019-07-12
    Description: In this paper, we propose models for non-life loss reserving combining traditional approaches such as Mack's or generalized linear models and gradient boosting algorithms in an individual framework. These claim-level models use information about each of the payments made for each of the claims in the portfolio, as well as characteristics of the insured. We provide an example based on a detailed dataset from a property and casualty insurance company. We contrast some traditional aggregate techniques, at the portfolio level, with our individual-level approach and we discuss some points related to practical applications.
    Electronic ISSN: 2227-9091
    Topics: Economics
  • 23
    Publication Date: 2019-07-03
    Description: We consider de Finetti’s stochastic control problem when the (controlled) process is allowed to spend time under the critical level. More precisely, we consider a generalized version of this control problem in a spectrally negative Lévy model with exponential Parisian ruin. We show that, under mild assumptions on the Lévy measure, an optimal strategy is formed by a barrier strategy and that this optimal barrier level is always less than the optimal barrier level when classical ruin is implemented. In addition, we give necessary and sufficient conditions for the barrier strategy at level zero to be optimal.
    Electronic ISSN: 2227-9091
    Topics: Economics
  • 24
    Publication Date: 2019-07-17
    Description: It is intuitive that proximity to hospitals can only improve the chances of survival from a range of medical conditions. This study examines the empirical evidence for this assertion, based on Australian data. While hospital proximity might serve as a proxy for other factors, such as indigeneity, income, wealth or geography, the evidence suggests that proximity provides the most direct link to these factors and, as it turns out, a statistically very significant one that transcends economies.
    Electronic ISSN: 2227-9091
    Topics: Economics
  • 25
    Publication Date: 2019-07-15
    Description: We analyzed real telematics information for a sample of drivers with usage-based insurance policies. We examined the statistical distribution of distance driven above the posted speed limit—which presents a strong positive asymmetry—using quantile regression models. We found that, at different percentile levels, the distance driven at speeds above the posted limit depends on total distance driven and, more generally, on factors such as the percentage of urban and nighttime driving and on the driver’s gender. However, the impact of these covariates differs according to the percentile level. We stress the importance of understanding telematics information, which should not be limited to simply characterizing average drivers, but can be useful for signaling dangerous driving by predicting quantiles associated with specific driver characteristics. We conclude that the risk of driving for long distances above the speed limit is heterogeneous and, moreover, we show that prevention campaigns should target primarily male non-urban drivers, especially if they present a high percentage of nighttime driving.
    Electronic ISSN: 2227-9091
    Topics: Economics
  • 26
    Publication Date: 2019-07-07
    Description: In this paper, the generalized Pareto distribution (GPD) copula approach is utilized to solve the conditional value-at-risk (CVaR) portfolio problem. In particular, this approach uses (i) copulas to model the complete linear and non-linear correlation dependence structure, (ii) Pareto tails to capture the estimates of the parametric Pareto lower tail, the non-parametric kernel-smoothed interior and the parametric Pareto upper tail, and (iii) value-at-risk (VaR) to quantify the risk measure. The simulated sample covers the G7, BRICS (Brazil, Russia, India, China and South Africa) and 14 popular emerging stock markets' returns for the period between 1997 and 2018. Our results suggest that the efficient frontier combining the minimized CVaR measure and simulated copula returns outperforms the risk/return of domestic portfolios, such as the US stock market. This result improves international diversification at the global level. We also show that the Gaussian and t-copula simulated returns give very similar, but not identical, results. Furthermore, the copula simulation provides more accurate market-risk estimates than historical simulation. Finally, the results support the notion that G7 countries can provide an important opportunity for diversification. These results are important to investors and policymakers.
    Electronic ISSN: 2227-9091
    Topics: Economics
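The two risk measures used in the paper can be computed by historical simulation in a few lines. The sketch below is a minimal pure-Python illustration (the paper's estimates instead come from GPD-copula simulated returns); the sample returns are made up.

```python
def var_cvar(returns, alpha=0.95):
    """Historical-simulation VaR and CVaR (expected shortfall) at
    confidence level alpha, reported as positive loss figures."""
    s = sorted(returns)                       # ascending: worst outcomes first
    k = max(1, int(round((1 - alpha) * len(s))))
    tail = s[:k]                              # the worst (1 - alpha) share
    var = -tail[-1]                           # loss at the cutoff
    cvar = -sum(tail) / len(tail)             # average loss beyond the cutoff
    return var, cvar

# Illustrative daily returns (made up), alpha chosen to keep the tail non-trivial.
returns = [-0.10, -0.05, -0.02, 0.00, 0.01, 0.02, 0.03, 0.04, 0.05, 0.06]
v90, c90 = var_cvar(returns, alpha=0.90)
v80, c80 = var_cvar(returns, alpha=0.80)
```

By construction CVaR is at least as large as VaR at the same level, which is why minimizing CVaR, as in the paper's portfolio problem, controls the tail beyond the VaR cutoff rather than just the cutoff itself.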
  • 27
    Publication Date: 2019-07-03
    Description: The hierarchical risk parity (HRP) approach to portfolio allocation, introduced by Lopez de Prado (2016), applies graph theory and machine learning to build a diversified portfolio. Like the traditional risk-based allocation methods, HRP is a function of the estimate of the covariance matrix; however, it does not require its invertibility. In this paper, we first study the impact of covariance misspecification on the performance of the different allocation methods. Next, we study, under an appropriate covariance forecast model, whether the machine-learning-based HRP outperforms the traditional risk-based portfolios. For our analysis, we use the test for superior predictive ability on out-of-sample portfolio performance to determine whether the observed excess performance is significant or occurred by chance. We find that when the covariance estimates are crude, inverse volatility weighted portfolios are more robust, followed by the machine-learning-based portfolios. Minimum variance and maximum diversification are most sensitive to covariance misspecification. HRP occupies the middle ground: it is less sensitive to covariance misspecification than the minimum variance or maximum diversification portfolios, while it is not as robust as the inverse volatility weighted portfolio. We also study the impact of different rebalancing horizons and how the portfolios compare against a market-capitalization weighted portfolio.
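The inverse-volatility portfolio that the abstract identifies as most robust is the simplest of the risk-based rules: weights proportional to the reciprocal of each asset's volatility, needing no covariance matrix at all. A minimal sketch:

```python
def inverse_volatility_weights(vols):
    """Weights proportional to 1/sigma_i, normalised to sum to one.
    Only the volatilities are used, so no covariance matrix (and hence
    no inversion) is required."""
    inv = [1.0 / v for v in vols]
    total = sum(inv)
    return [x / total for x in inv]

# Three assets with annualised volatilities 10%, 20% and 40%.
w = inverse_volatility_weights([0.10, 0.20, 0.40])
```

Ignoring correlations entirely is what makes this rule insensitive to covariance misspecification.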
  • 28
    Publication Date: 2019-07-06
    Description: In this paper, we measure systemic risk with a novel methodology based on a “spatial-temporal” approach. We propose a new bank systemic risk measure that considers the two components of systemic risk: the cross-sectional and the time dimension. The aim is to highlight the “time-space dynamics” of contagion, i.e., whether the CDS spread of bank i depends on the CDS spreads of other banks. To do this, we use an advanced spatial econometrics design with a time-varying spatial dependence that can be interpreted as an index of the degree of cross-sectional spillovers. The findings highlight that Eurozone banks show strong spatial dependence in the evolution of CDS spreads; that is, the contagion effect is present and persistent. Moreover, we analyse the role of the European Central Bank in managing contagion risk. We find that monetary policy has been effective in reducing systemic risk. However, the results show that systemic risk alone does not trigger a policy intervention, highlighting that financial stability policy is not yet an objective in its own right.
  • 29
    Publication Date: 2019-06-13
    Description: In this research, trade credit is analysed from the seller (supplier) perspective. Trade credit allows the supplier to increase sales and profits, but creates the risk that the customer will not pay, and at the same time increases the risk of the supplier’s insolvency. If the supplier is a small or micro-enterprise (SMiE), the assessment of this risk is usually constrained by limited human and technical resources. Such a supplier therefore needs a highly accurate, yet simple and highly interpretable, trade credit risk assessment model for evaluating the insolvency risk of buyers (who are usually also SMiE). The aim of the research is to create a statistical enterprise trade credit risk assessment (ETCRA) model for Lithuanian small and micro-enterprises (SMiE). In the empirical analysis, the financial and non-financial data of 734 small and micro-sized enterprises over the period 2010–2012 were chosen as the sample. Based on logistic regression, the ETCRA model was developed using financial and non-financial variables. In the ETCRA model, the enterprise’s financial performance is assessed from different perspectives: profitability, liquidity, solvency, and activity. Model variants were created using (i) only financial ratios and (ii) financial ratios together with non-financial variables. The inclusion of non-financial variables does not substantially improve the characteristics of the model, which means that models using only financial ratios can be used in practice, although models that include non-financial variables remain usable as well. The designed models can be used by suppliers when making decisions about granting trade credit to small or micro-enterprises.
  • 30
    Publication Date: 2019-06-30
    Description: In this study, we consider the problem of zero claims in a liability insurance portfolio and compare the predictive performance of three models. We use French motor third party liability (MTPL) insurance data, which have been used for a pricing game, and show how the type of coverage and the policyholders’ willingness to subscribe to telematics-based insurance pricing affect their driving behaviour and hence their claims. Using our validation set, we then predict the number of zero claims. Our results show that although a zero-inflated Poisson (ZIP) model performs better than a Poisson regression, it can even be outperformed by logistic regression.
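The zero-inflated Poisson model named in the abstract mixes a point mass at zero (probability pi) with an ordinary Poisson count, so the zero probability exceeds the plain Poisson one. A minimal sketch of its probability mass function (parameter values below are illustrative only):

```python
import math

def zip_pmf(k, lam, pi):
    """Zero-inflated Poisson pmf: extra probability mass pi at zero,
    Poisson(lam) counts with probability 1 - pi."""
    pois = math.exp(-lam) * lam ** k / math.factorial(k)
    return pi + (1 - pi) * pois if k == 0 else (1 - pi) * pois

# With lam = 1 and 30% structural zeros, P(N = 0) = 0.3 + 0.7 * e^{-1}.
p0 = zip_pmf(0, 1.0, 0.3)
```

A logistic regression, by contrast, models the zero/non-zero indicator directly, which is why it can compete on the zero-claim prediction task.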
  • 31
    Publication Date: 2019-06-20
    Description: XGBoost is recognized as an algorithm with exceptional predictive capacity. Models for a binary response indicating the existence of accident claims versus no claims can be used to identify the determinants of traffic accidents. This study compared the relative performances of logistic regression and XGBoost approaches for predicting the existence of accident claims using telematics data. The dataset contained information from an insurance company about the individuals’ driving patterns—including total annual distance driven and percentage of total distance driven in urban areas. Our findings showed that logistic regression is a suitable model given its interpretability and good predictive capacity. XGBoost requires numerous model-tuning procedures to match the predictive performance of the logistic regression model, and greater effort as regards interpretation.
  • 32
    Publication Date: 2019-07-01
    Description: Solvency II requirements introduced new issues for actuarial risk management in non-life insurance, challenging the market to develop a consciousness of its own risk profile and to investigate the sensitivity of the solvency ratio to the insurance risks and technical results from both a short-term and a medium-term perspective. To this aim, in the present paper, a partial internal model for premium risk is developed for three multi-line non-life insurers, and the impact of several different business mixes is analyzed. Furthermore, the risk-mitigation and profitability impact of reinsurance in the premium risk model is introduced, and a global framework for a feasible application of this model consistent with a medium-term analysis is provided. Numerical results are also presented, with evidence of various effects for several portfolios and reinsurance arrangements, pointing out the main reasons for these differences.
  • 33
    Publication Date: 2019-06-04
    Description: In the past two decades, increasing computational power has resulted in the development of more advanced claims reserving techniques, allowing stochastic methods to supersede deterministic ones and produce forecasts of enhanced quality. Hence, not only point estimates but predictive distributions can be generated in order to forecast future claim amounts. The significant expansion in the variety of models requires the validation of these methods and the creation of supporting techniques for appropriate decision making. The present article compares and validates several existing and self-developed stochastic methods on actual data, applying comparison measures in an algorithmic manner.
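The abstract does not list the methods compared, but the deterministic baseline that stochastic reserving techniques build on is the chain-ladder method. As a hedged sketch, here is the volume-weighted development-factor computation on a tiny cumulative run-off triangle (all figures invented):

```python
def chain_ladder_factors(triangle):
    """Volume-weighted development factors from a cumulative run-off
    triangle: `triangle` is a list of rows (origin years), each a list
    of cumulative paid amounts by development year."""
    n = len(triangle[0])
    factors = []
    for j in range(n - 1):
        # Use only rows observed at development year j + 1.
        num = sum(row[j + 1] for row in triangle if len(row) > j + 1)
        den = sum(row[j] for row in triangle if len(row) > j + 1)
        factors.append(num / den)
    return factors

f = chain_ladder_factors([[100, 150, 160], [120, 180], [130]])
```

Stochastic methods such as bootstrap or Bayesian variants wrap a predictive distribution around the point forecasts these factors produce.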
  • 34
    Publication Date: 2019-06-05
    Description: We investigate a shot noise process with subexponential shot marks occurring at renewal epochs. Our main result is a precise asymptotic formula for its tail probability. In doing so, some recent results regarding sums of randomly weighted subexponential random variables play a crucial role.
  • 35
    Publication Date: 2019-06-10
    Description: This paper discusses ambiguity in the context of single-name credit risk. We focus on uncertainty in the default intensity but also discuss uncertainty in the recovery in a fractional recovery of the market value. This approach is a first step towards integrating uncertainty in credit-risky term structure models and can profit from its simplicity. We derive drift conditions in a Heath–Jarrow–Morton forward rate setting in the case of ambiguous default intensity in combination with zero recovery, and in the case of ambiguous fractional recovery of the market value.
  • 36
    Publication Date: 2019-06-12
    Description: This work examines the apportionment of multiplicative risks by considering three dominance orderings: first-degree stochastic dominance, Rothschild and Stiglitz’s increase in risk, and downside risk increase. We use the relative nth-degree risk aversion measure and decreasing relative nth-degree risk aversion to provide conditions guaranteeing the preference for “harm disaggregation” of multiplicative risks. Further, we relate our conclusions to preferences toward bivariate lotteries, which can be interpreted in terms of correlation-aversion, cross-prudence and cross-temperance.
  • 37
    Publication Date: 2019-06-17
    Description: It is known that the classical ruin function under an exponential claim-size distribution depends on two parameters, referred to as the mean claim size and the relative security loading. These parameters are assumed to be unknown and random; thus, a loss function that measures the loss sustained by a decision-maker who takes as valid a ruin function which is not correct can be considered. By using a squared-error loss function and appropriate distribution functions for these parameters, the issue of estimating the ruin function reduces to a mixture procedure. Firstly, a bivariate distribution for mixing the two parameters jointly is considered; secondly, different univariate distributions for mixing each parameter separately are examined. Consequently, a catalogue of ruin probability functions and severities of ruin that are more flexible than the original ones is obtained. The methodology is also extended to the Pareto claim-size distribution. Several numerical examples illustrate the performance of these functions.
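The two-parameter ruin function the abstract starts from is the classical Cramér–Lundberg formula for exponential claims, psi(u) = exp(-theta*u / (mu*(1 + theta))) / (1 + theta), with mean claim size mu and relative security loading theta. A minimal sketch (the paper then mixes over random mu and theta, which is not shown here):

```python
import math

def ruin_probability_exponential(u, mu, theta):
    """Classical ruin probability for exponentially distributed claims
    with mean mu, relative security loading theta > 0 and initial
    capital u >= 0."""
    return math.exp(-theta * u / (mu * (1 + theta))) / (1 + theta)

# With no initial capital, ruin probability is 1 / (1 + theta).
psi0 = ruin_probability_exponential(0.0, 1.0, 0.25)
```

Mixing this function over a prior on (mu, theta) and averaging, as the abstract describes, yields the more flexible ruin-probability catalogue.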
  • 38
    Publication Date: 2019-06-12
    Description: One of the key components of counterparty credit risk (CCR) measurement is generating scenarios for the evolution of the underlying risk factors, such as interest and exchange rates, equity and commodity prices, and credit spreads. Geometric Brownian Motion (GBM) is a widely used method for modeling the evolution of exchange rates. An important limitation of GBM is that, due to the assumption of constant drift and volatility, stylized facts of financial time-series, such as volatility clustering and heavy-tailedness in the returns distribution, cannot be captured. We propose a model where volatility and drift are able to switch between regimes; more specifically, they are governed by an unobservable Markov chain. Hence, we model exchange rates with a hidden Markov model (HMM) and generate scenarios for counterparty exposure using this approach. A numerical study is carried out and backtesting results for a number of exchange rates are presented. The impact of using a regime-switching model on counterparty exposure is found to be profound for derivatives with non-linear payoffs.
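A hedged sketch of the regime-switching idea described above: drift and volatility of a GBM are driven by a two-state Markov chain (the paper estimates a hidden Markov model from data; here the chain, parameters and seed are invented for illustration):

```python
import math
import random

def simulate_rs_gbm(s0, steps, dt, params, stay, seed=42):
    """Simulate a 2-regime geometric Brownian motion.  `params[i]` is
    the pair (mu, sigma) in regime i; `stay[i]` is the probability of
    remaining in regime i at each step (a simple 2-state Markov chain)."""
    rng = random.Random(seed)
    regime, s, path = 0, s0, [s0]
    for _ in range(steps):
        if rng.random() > stay[regime]:
            regime = 1 - regime                  # switch regime
        mu, sigma = params[regime]
        z = rng.gauss(0.0, 1.0)
        s *= math.exp((mu - 0.5 * sigma ** 2) * dt
                      + sigma * math.sqrt(dt) * z)
        path.append(s)
    return path

# Calm regime (5% vol) vs. turbulent regime (30% vol), daily steps.
path = simulate_rs_gbm(1.0, 100, 1 / 252,
                       [(0.0, 0.05), (0.0, 0.30)], [0.98, 0.95])
```

Generating many such paths and revaluing the derivative along each one gives the exposure scenarios used for CCR measurement.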
  • 39
    Publication Date: 2019-01-23
    Description: In this paper, we employ 99% intraday value-at-risk (VaR) and intraday expected shortfall (ES) as risk metrics to assess the competency of the Multiplicative Component Generalised Autoregressive Heteroskedasticity (MC-GARCH) models based on the 1-min EUR/USD exchange rate returns. Five distributional assumptions for the innovation process are used to analyse their effects on the modelling and forecasting performance. The high-frequency volatility models were validated in terms of in-sample fit based on various statistical and graphical tests. A more rigorous validation procedure involves testing the predictive power of the models. Therefore, three backtesting procedures were used for the VaR, namely, the Kupiec’s test, a duration-based backtest, and an asymmetric VaR loss function. Similarly, three backtests were employed for the ES: a regression-based backtesting procedure, the Exceedance Residual backtest and the V-Tests. The validation results show that non-normal distributions are best suited for both model fitting and forecasting. The MC-GARCH(1,1) model under the Generalised Error Distribution (GED) innovation assumption gave the best fit to the intraday data and gave the best results for the ES forecasts. However, the asymmetric Skewed Student’s-t distribution for the innovation process provided the best results for the VaR forecasts. This paper presents the results of the first empirical study (to the best of the authors’ knowledge) in: (1) forecasting the intraday Expected Shortfall (ES) under different distributional assumptions for the MC-GARCH model; (2) assessing the MC-GARCH model under the Generalised Error Distribution (GED) innovation; (3) evaluating and ranking the VaR predictability of the MC-GARCH models using an asymmetric loss function.
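Of the three VaR backtests named above, Kupiec's proportion-of-failures test is the simplest: it compares the observed violation rate with the nominal coverage via a likelihood ratio. A minimal sketch (assumes 0 < x < n so the logarithms are defined):

```python
import math

def kupiec_pof(n, x, p):
    """Kupiec proportion-of-failures LR statistic for x VaR violations
    in n observations at nominal coverage p; asymptotically chi-square
    with one degree of freedom under correct coverage."""
    phat = x / n

    def loglik(q):
        return (n - x) * math.log(1 - q) + x * math.log(q)

    return -2.0 * (loglik(p) - loglik(phat))

# 10 violations in 1000 days at a 99% VaR is exactly the nominal rate.
lr = kupiec_pof(1000, 10, 0.01)
```

Large values of the statistic relative to the chi-square critical value reject the VaR model's coverage.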
  • 40
    Publication Date: 2019-10-29
    Description: The misestimation of rating transition probabilities may lead banks to lend money incoherently with borrowers’ default trajectories, causing both a deterioration in asset quality and higher system distress. Applying a Mover-Stayer model to determine the migration risk of small and medium enterprises, we find that banks are over-estimating their credit risk, resulting in excessive regulatory capital. This has important macroeconomic implications because holding a large capital buffer is costly for banks, which in turn influences their ability to lend in the wider economy. This conclusion is particularly true during economic downturns, with the consequence of exacerbating the cyclicality in risk capital and thereby aggravating economic conditions further. We also explain part of the misevaluation of borrowers and the actual weight of non-performing loans within banking portfolios: some of the prudential requirements, at least as regards SME credit portfolios, cannot be considered effective as envisaged by the regulators who developed the “new” regulation in response to the most recent crisis. The Mover-Stayer approach helps to reduce calculation inaccuracy when analyzing the historical movements of borrowers’ ratings and, consequently, improves the efficacy of the resource allocation process and banking industry stability.
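A minimal sketch of the Mover-Stayer structure named above: each rating grade contains "stayers" who never migrate and "movers" who follow an ordinary Markov chain, so the one-period transition matrix is a row-wise mixture (the grades and probabilities below are invented):

```python
def mover_stayer_matrix(stayer, movers):
    """One-period transition matrix of a Mover-Stayer model.
    stayer[i] is the share of grade-i obligors who never migrate;
    movers is the Markov transition matrix of the mover population."""
    n = len(stayer)
    return [[stayer[i] * (1.0 if i == j else 0.0)
             + (1 - stayer[i]) * movers[i][j]
             for j in range(n)] for i in range(n)]

# Two grades: 60% stayers in grade 0, 20% in grade 1.
P = mover_stayer_matrix([0.6, 0.2], [[0.7, 0.3], [0.4, 0.6]])
```

Because the stayer mass inflates the diagonal, ignoring it (i.e. fitting a plain Markov chain) overstates migration risk, which is the over-estimation effect the abstract describes.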
  • 41
    Publication Date: 2019-10-25
    Description: The aggregation of individual risks into total risk using a weighting variable multiplied by two ratio variables representing incidence and intensity is an important task for risk professionals. For example, expected loss (EL) of a loan is the product of exposure at default (EAD), probability of default (PD), and loss given default (LGD) of the loan. Simple weighted (by EAD) means of PD and LGD are intuitive summaries however they do not satisfy a reconciliation property whereby their product with the total EAD equals the sum of the individual expected losses. This makes their interpretation problematic, especially when trying to ascertain whether changes in EAD, PD, or LGD are responsible for a change in EL. We propose means for PD and LGD that have the property of reconciling at the aggregate level. Properties of the new means are explored, including how changes in EL can be attributed to changes in EAD, PD, and LGD. Other applications such as insurance where the incidence ratio is utilization rate (UR) and the intensity ratio is an average benefit (AB) are discussed and the generalization to products of more than two ratio variables provided.
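One concrete way to obtain the reconciliation property described above (a sketch, not necessarily the authors' exact construction): take the EAD-weighted mean of PD and the PD-times-EAD-weighted mean of LGD, so that total EAD times the two means reproduces total expected loss exactly:

```python
def reconciling_means(ead, pd, lgd):
    """EAD-weighted mean PD and (EAD * PD)-weighted mean LGD.  By
    construction, sum(ead) * pd_bar * lgd_bar equals the sum of the
    individual expected losses ead_i * pd_i * lgd_i."""
    total_ead = sum(ead)
    weighted_pd = sum(e * p for e, p in zip(ead, pd))
    pd_bar = weighted_pd / total_ead
    lgd_bar = sum(e * p * l for e, p, l in zip(ead, pd, lgd)) / weighted_pd
    return pd_bar, lgd_bar

# Two illustrative loans.
ead, pd, lgd = [100.0, 200.0], [0.02, 0.05], [0.4, 0.6]
pd_bar, lgd_bar = reconciling_means(ead, pd, lgd)
el_total = sum(e * p * l for e, p, l in zip(ead, pd, lgd))
```

A plain EAD-weighted mean of LGD would not reconcile, which is exactly the interpretation problem the abstract raises.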
  • 42
    Publication Date: 2019-11-01
    Description: The study of connectedness is key to assess spillover effects and identify lead-lag relationships among market exchanges trading the same asset. By means of an extension of the Diebold and Yilmaz (2012) econometric connectedness measures, we examined the relationships of five major Bitcoin exchange platforms during two periods of main interest: the 2017 surge in prices and the 2018 decline. We concluded that Bitfinex and Gemini are leading exchanges in terms of return spillover transmission during the analyzed time-frame, while Bittrex acts as a follower. We also found that connectedness of overall returns fell substantially right before the Bitcoin price hype, whereas it leveled out during the down-market period. We confirmed that the results are robust with regard to the modeling strategies.
  • 43
    Publication Date: 2019-10-14
    Description: We consider the Sparre Andersen risk process with interclaim times that belong to the class of distributions with rational Laplace transform. We construct error bounds for the ruin probability based on the Pollaczek–Khintchine formula, and develop an efficient algorithm to approximate the ruin probability for completely monotone claim size distributions. Our algorithm improves earlier results and can be tailored towards achieving a predetermined accuracy of the approximation.
  • 44
    Publication Date: 2019-11-05
    Description: In this paper, we apply machine learning to forecast the conditional variance of long-term stock returns measured in excess of different benchmarks, considering the short- and long-term interest rate, the earnings-by-price ratio, and the inflation rate. In particular, we apply in a two-step procedure a fully nonparametric local-linear smoother and choose the set of covariates as well as the smoothing parameters via cross-validation. We find that volatility forecastability is much less important at longer horizons regardless of the chosen model and that the homoscedastic historical average of the squared return prediction errors gives an adequate approximation of the unobserved realised conditional variance for both the one-year and five-year horizon.
  • 45
    Publication Date: 2019-10-18
    Description: First, we give a closed-form formula for the first passage time of a reflected Brownian motion with drift. This corrects a formula by Perry et al. (2004). Second, we show that the maximum before a fixed drawdown is exponentially distributed for any drawdown if and only if the diffusion characteristic μ/σ² is constant. This complements the sufficient condition formulated by Lehoczky (1977). Third, we give an alternative proof of the fact that the maximum before a fixed drawdown is exponentially distributed for any spectrally negative Lévy process, a result due to Mijatović and Pistorius (2012). Our proof is similar to, but simpler than, those of Lehoczky (1977) and Landriault et al. (2017).
  • 46
    Publication Date: 2019-10-14
    Description: In this paper, we study a generalised CIR process with externally-exciting and self-exciting jumps, and focus on the distributional properties and applications of this process and its aggregated process. The aim of the paper is to introduce a more general process that includes many models in the literature featuring self-exciting and externally-exciting jumps. The first and second moments of this jump-diffusion process are used to calculate the insurance premium based on the mean-variance principle. The Laplace transform of the aggregated process is derived, and this leads to an application for pricing default-free bonds which could capture the impacts of both exogenous and endogenous shocks. Illustrative numerical examples and comparisons with other models are also provided.
  • 47
    Publication Date: 2019-08-26
    Description: Long-term care (LTC) encompasses a set of services provided to impaired and dependent elderly people. To assess the level of the dependence several scales are used, including activities of daily living (ADL), instrumental ADL (IADL) and functional limitations. Once an elderly person fails to perform these activities independently, he or she requires special assistance. Help can be provided as informal care by relatives and as formal care by professionals. The aim of this research is to study individual characteristics that relate to the demand of LTC and to analyze the relation between formal and informal care. We base our study on data from the Swiss Health Survey focusing on respondents aged over 65 years. Using the structural equation modeling technique, we develop a statistical model that considers the dependence concept as a latent variable. This hidden dependence variable combines three indices linked to the limitations in ADL, in IADL and functional limitations. Accounting for causality links between covariates enables us to include the indirect effect of pathologies on the receipt of LTC mediated via dependence. In our model, we do not assume a causal relationship between formal and informal care. From our results, we observe a significant impact of pathologies as well as of the socio-demographic factors on the demand for LTC. The relationship between formal and informal care is found to be of both a complementary and substitutional nature.
  • 48
    Publication Date: 2019-08-05
    Description: The Growth-Optimal Portfolio (GOP) theory determines the path of bet sizes that maximize long-term wealth. This multi-horizon goal makes it more appealing among practitioners than myopic approaches, like Markowitz’s mean-variance or risk parity. The GOP literature typically considers risk-neutral investors with an infinite investment horizon. In this paper, we compute the optimal bet sizes in the more realistic setting of risk-averse investors with finite investment horizons. We find that, under this more realistic setting, the optimal bet sizes are considerably smaller than previously suggested by the GOP literature. We also develop quantitative methods for determining the risk-adjusted growth allocations (or risk budgeting) for a given finite investment horizon.
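For reference, the classical GOP benchmark that the abstract argues should be shrunk under risk aversion and finite horizons is the Kelly fraction; for a binary bet paying b:1 with win probability p it has a one-line closed form (this sketch shows only that benchmark, not the paper's risk-adjusted allocations):

```python
def kelly_fraction(p, b):
    """Growth-optimal (full-Kelly) bet fraction for a binary bet that
    pays b units per unit staked with win probability p.  Formula:
    f* = p - (1 - p) / b; negative values mean the bet is unfavourable."""
    return p - (1 - p) / b

# A 60/40 even-money bet: the full-Kelly stake is 20% of wealth.
f = kelly_fraction(0.6, 1.0)
```

The paper's point is that a risk-averse investor with a finite horizon should bet considerably less than this f*.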
  • 49
    Publication Date: 2019-09-01
    Description: Managing unemployment is one of the key issues in social policies. Unemployment insurance schemes are designed to cushion the financial and morale blow of the loss of a job, but also to encourage the unemployed to seek new jobs more proactively due to the continuous reduction of benefit payments. In the present paper, a simple model of unemployment insurance is proposed with a focus on the optimality of the individual’s entry into the scheme. The corresponding optimal stopping problem is solved, and its similarities to and differences from the perpetual American call option are discussed. Beyond a purely financial point of view, we argue that in the actuarial context the optimal decisions should take into account other possible preferences through a suitable utility function. Some examples in this direction are worked out.
  • 50
    Publication Date: 2019-09-07
    Description: It has been six years since Editor-in-Chief Steffensen (2013) wrote in his editorial that “to Risks inclusiveness, inter-disciplinarity, and open-mindedness is the very starting point [...]
  • 51
    Publication Date: 2019-09-16
    Description: We propose a novel approach for loss reserving based on deep neural networks. The approach allows for joint modeling of paid losses and claims outstanding, and incorporation of heterogeneous inputs. We validate the models on loss reserving data across lines of business, and show that they improve on the predictive accuracy of existing stochastic methods. The models require minimal feature engineering and expert input, and can be automated to produce forecasts more frequently than manual workflows.
  • 52
    Publication Date: 2019-10-20
    Description: The purpose of this study is to examine the volatility-timing performance of Singapore-based funds under the Central Provident Fund (CPF) Investment Scheme and non-CPF linked funds by taking into account the currency risk effect on internationally managed funds. In particular, we empirically assess whether the funds under the CPF Investment Scheme outperform non-CPF funds by examining the volatility-timing performance associated with these funds. The volatility-timing ability of CPF funds will provide the CPF board with a new method for risk classification. We employ the GARCH models and modified factor models to capture the response of funds to market abnormal conditional volatility including the weekday effect. The SMB and HML factors for non-US based funds are constructed from stock market data to exclude the contribution of the size effect and the BE/ME effect. The results show that volatility timing is one of the factors contributing to the excess return of funds. However, funds’ volatility-timing seems to be country-specific. Most of the Japanese equity funds and global equity funds under the CPF Investment Scheme are found to have the ability of volatility timing. This finding contrasts with the existing studies on Asian, ex-Japan funds and Greater China funds. Moreover, there is no evidence that funds under the CPF Investment Scheme show a better group performance of volatility timing.
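The GARCH models used above rest on a simple recursion: next period's conditional variance is a weighted blend of a constant, the last squared return, and the last variance. A minimal GARCH(1,1) sketch (parameter values below are illustrative, not estimates from the paper):

```python
def garch11_variance(returns, omega, alpha, beta):
    """Conditional variance path of a GARCH(1,1):
    sigma2_t = omega + alpha * r_{t-1}^2 + beta * sigma2_{t-1},
    started from the unconditional variance omega / (1 - alpha - beta)
    (requires alpha + beta < 1)."""
    sigma2 = omega / (1 - alpha - beta)
    path = [sigma2]
    for r in returns:
        sigma2 = omega + alpha * r * r + beta * sigma2
        path.append(sigma2)
    return path

# One large return shock raises the conditional variance next period.
path = garch11_variance([2.0], omega=0.1, alpha=0.1, beta=0.8)
```

Volatility-timing studies then ask whether fund betas respond to movements in this conditional variance.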
  • 53
    Publication Date: 2019-08-02
    Description: This paper revisits the spectrally negative Lévy risk process embedded with the general tax structure introduced in Kyprianou and Zhou (2009). A joint Laplace transform is found concerning the first down-crossing time below level 0. The potential density is also obtained for the taxed Lévy risk process killed upon leaving [0, b]. The results are expressed using scale functions.
  • 54
    Publication Date: 2019-08-01
    Description: We consider a two-dimensional ruin problem where the surplus process of business lines is modelled by a two-dimensional correlated Brownian motion with drift. We study the ruin function P(u) for component-wise ruin (that is, both business lines are ruined in an infinite-time horizon), where u is the same initial capital for each line. We measure the goodness of the business by analysing the adjustment coefficient, that is, the limit of −ln P(u)/u as u tends to infinity, which depends essentially on the correlation ρ of the two surplus processes. In order to work out the adjustment coefficient, we solve a two-layer optimization problem.
  • 55
    Publication Date: 2019-07-19
    Description: The purpose of this paper is to survey recent developments in granular models and machine learning models for loss reserving, and to compare the two families with a view to assessment of their potential for future development. This is best understood against the context of the evolution of these models from their predecessors, and the early sections recount relevant archaeological vignettes from the history of loss reserving. However, the larger part of the paper is concerned with the granular models and machine learning models. Their relative merits are discussed, as are the factors governing the choice between them and the older, more primitive models. Concluding sections briefly consider the possible further development of these models in the future.
  • 56
    Publication Date: 2019-08-05
    Description: We obtain closed-form expressions for the value of the joint Laplace transform of the running maximum and minimum of a diffusion-type process stopped at the first time at which the associated drawdown or drawup process hits a constant level before an independent exponential random time. It is assumed that the coefficients of the diffusion-type process are regular functions of the current values of its running maximum and minimum. The proof is based on the solution to the equivalent inhomogeneous ordinary differential boundary-value problem and the application of the normal-reflection conditions for the value function at the edges of the state space of the resulting three-dimensional Markov process. The result is related to the computation of probability characteristics of the take-profit and stop-loss values of a market trader during a given time period.
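The drawdown process referred to above has an elementary pathwise definition: the running maximum minus the current value (the drawup is the symmetric notion, current value minus running minimum). A minimal sketch on a discrete path:

```python
def drawdown_path(path):
    """Drawdown process of a discrete path: running maximum minus the
    current value at each step."""
    dd, run_max = [], float('-inf')
    for x in path:
        run_max = max(run_max, x)
        dd.append(run_max - x)
    return dd

# The drawdown resets to 0 whenever a new maximum is reached.
dd = drawdown_path([1, 3, 2, 5, 4])
```

The stopping times in the abstract are the first instants at which this process (or its drawup counterpart) reaches a given constant level.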
    Electronic ISSN: 2227-9091
    Topics: Economics
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
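The closed-form expressions are not reproduced in the abstract above, but the quantity being computed can be approximated by brute-force simulation. The sketch below is a minimal illustration, not the paper's method: it uses a plain drifted Brownian motion (the paper allows coefficients depending on the running maximum and minimum), omits the independent exponential killing time, and uses made-up parameters `mu`, `sigma`, `d`, `a`, `b`.

```python
import numpy as np

# Monte Carlo estimate of E[exp(-a*M_tau - b*m_tau)], where M and m are the
# running maximum and minimum of a drifted Brownian motion and tau is the
# first time the drawdown M_t - X_t reaches the level d
rng = np.random.default_rng(5)
mu, sigma, d = 0.05, 0.2, 0.3      # illustrative drift, volatility, drawdown level
a, b = 1.0, 1.0                    # Laplace-transform arguments
dt, n_steps, n_paths = 1e-3, 10_000, 2_000

vals = np.empty(n_paths)
for i in range(n_paths):
    x = np.cumsum(mu * dt + sigma * np.sqrt(dt) * rng.standard_normal(n_steps))
    running_max = np.maximum.accumulate(x)
    hit = np.nonzero(running_max - x >= d)[0]
    t = hit[0] if hit.size else n_steps - 1   # drawdown time (or end of horizon)
    vals[i] = np.exp(-a * running_max[t] - b * np.min(x[:t + 1]))

print(vals.mean())   # crude estimate of the joint Laplace transform
```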
  • 57
    Publication Date: 2019-08-01
    Description: The SME sector is a major force in national economic and social development. Financial risk is one of the key threats to the activity of small and medium enterprises. Its most common manifestation among SMEs is difficulty in financing the business and a lack of funds for development. Banks are unwilling to grant loans to such companies. Moreover, rising operating costs shrink profits, which may result in corporate debt, difficulty in debt repayment and, consequently, high financial risk for these entities. Numerous differences between the operations of small and large enterprises intensify this risk and mean that the corporate credit financing model is not adjusted to the capabilities and operating principles of small enterprises. Risk management is therefore one of the most important internal processes in small and medium enterprises, and identifying the factors that affect the level of financial risk in these entities is crucial. The main objective of this research was to analyze the impact of selected parametric characteristics of the SME sector on the intensity of the financial risk they take. This objective was accomplished on the basis of a survey of Polish SMEs. To test the adopted research assumptions, a linear regression model with four continuous variables was used for each type of identified financial risk. Based on the final research results, a logit model was obtained for the risk of insufficient profits. It indicated that the internationalization of the company and the ability to manage risk are the only factors that affect a high level of risk of low income. The article ends with a discussion and a comparison with previous research in this area.
    Electronic ISSN: 2227-9091
    Topics: Economics
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 58
    Publication Date: 2019-09-01
    Description: In actuarial modelling of risk pricing and loss reserving in general insurance, also known as P&C or non-life insurance, there is business value in the predictive power and automation that machine learning brings. However, interpretability can be critical, especially when explaining results to key stakeholders and regulators. We present a granular machine learning model framework to jointly predict loss development and segment risk pricing. Generalising the Payments per Claim Incurred (PPCI) loss reserving method with risk variables and residual neural networks, the framework combines interpretable linear and sophisticated neural network components so that the ‘unexplainable’ component can be identified and regularised with a separate penalty. The model is tested on a real-life insurance dataset and generally outperforms PPCI in predicting ultimate loss when the sample size is sufficient.
    Electronic ISSN: 2227-9091
    Topics: Economics
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 59
    Publication Date: 2019-07-07
    Description: The aim of this study was to investigate whether firms’ reporting delays are interconnected with bankruptcy risk and its financial determinants. The study was based on 698,189 firm-year observations from Estonia. Annual report submission delay, in either binary or ordinal form, was used as the dependent variable, while bankruptcy risk based on an international model, or the financial ratios determining it, were the independent variables. The findings indicated that firms with lower liquidity and lower annual and accumulated profitability were more likely to delay the submission of an annual report past the legal deadline. In turn, firm leverage was not interconnected with reporting delays. In addition, firms with a higher risk of bankruptcy were more likely to delay the submission of their annual reports. The results varied across firms of different ages, sizes and industries. Stakeholders should be aware that reporting delays can be conditioned by higher bankruptcy risk or poor performance, and thus, for instance, crediting such firms should be treated with caution. State institutions controlling timely submission should take strict(er) measures against firms delaying for a lengthy period.
    Electronic ISSN: 2227-9091
    Topics: Economics
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 60
    Publication Date: 2019-06-20
    Description: To reduce the negative impacts of risks in farming due to climate change, the government implemented agricultural production cost insurance in 2015. Although a large subsidy has been allocated by the government (80 percent of the premium), farmers’ participation rate is still low (23 percent of the target in 2016). To address this issue, it is indispensable to identify farmers’ willingness to pay (WTP) for, and the determinants of their participation in, agricultural production cost insurance. Based on a field survey of 240 smallholder farmers in the Garut District, West Java Province in August–October 2017 and February 2018, the contingent valuation method (CVM) estimated farmers’ mean WTP at Rp 30,358/ha/cropping season ($2.25/ha/cropping season), which is 16 percent lower than the current premium (Rp 36,000/ha/cropping season = $2.67/ha/cropping season). Farmers who participated in agricultural production cost insurance shared some characteristics: operating larger farmland, more contact with the agricultural extension service, lower expected production for the next cropping season, and a downstream area location.
    Electronic ISSN: 2227-9091
    Topics: Economics
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 61
    Publication Date: 2019-05-19
    Description: The primary objective of this work is to analyze model-based Value-at-Risk associated with mortality risk arising from issued term life assurance contracts and to compare the results with the capital requirements for mortality risk as determined using the Solvency II Standard Formula. In particular, two approaches to calculating Value-at-Risk are analyzed: one-year VaR and run-off VaR. The calculations of Value-at-Risk are performed using stochastic mortality rates which are calibrated using the Lee-Carter model fitted to mortality data of selected European countries. Results indicate that, depending on the approach taken to calculate Value-at-Risk, the key factors driving its relative size are: sensitivity of technical provisions to the latest mortality experience, volatility of mortality rates in a country, policy term and benefit formula. Overall, we found that the Solvency II Standard Formula on average delivers an adequate capital requirement; however, we also highlight particular situations where it could understate or overstate portfolio-specific model-based Value-at-Risk for mortality risk.
    Electronic ISSN: 2227-9091
    Topics: Economics
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 62
    Publication Date: 2019-05-18
    Description: In the presence of recovery risk, the recovery rate is a random variable whose risk-neutral expectation can be inferred from the prices of defaultable instruments. I extract market-implied recovery rates from the term structures of credit default swap spreads for a sample of 497 United States (U.S.) corporate issuers over the 2005–2014 period. I analyze the explanatory factors of market-implied recovery rates within a linear regression framework and also within a Tobit model, and I compare them with the determinants of historical recovery rates that were previously identified in the literature. In contrast to their historical counterparts, market-implied recovery rates are mostly driven by macroeconomic factors and long-term, issuer-specific variables. Short-term financial variables and industry conditions significantly impact the slope of market-implied recovery rates. These results indicate that the design of a recovery risk model should be based on specific market factors, not on the statistical evidence that is provided by historical recovery rates.
    Electronic ISSN: 2227-9091
    Topics: Economics
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 63
    Publication Date: 2019-06-01
    Description: This is Part III of a series of papers which focus on a general framework for portfolio theory. Here, we extend a general framework for portfolio theory in a one-period financial market as introduced in Part I [Maier-Paape and Zhu, Risks 2018, 6(2), 53] to multi-period markets. This extension is reasonable for applications. More importantly, we take a new approach, the “modular portfolio theory”, which is built from the interaction among four related modules: (a) multi period market model; (b) trading strategies; (c) risk and utility functions (performance criteria); and (d) the optimization problem (efficient frontier and efficient portfolio). An important concept that allows dealing with the more general framework discussed here is a trading strategy generating function. This concept limits the discussion to a special class of manageable trading strategies, which is still wide enough to cover many frequently used trading strategies, for instance “constant weight” (fixed fraction). As application, we discuss the utility function of compounded return and the risk measure of relative log drawdowns.
    Electronic ISSN: 2227-9091
    Topics: Economics
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 64
    Publication Date: 2019-06-01
    Description: Stochastic mortality models have been developed for a range of applications from demographic projections to financial management. Financial risk based models build on methods used for interest rates and apply these to mortality rates. They have the advantage of being applicable to financial pricing and the management of longevity risk. Olivier and Jeffery (2004) and Smith (2005) proposed a model based on a forward-rate mortality framework with stochastic factors driven by univariate gamma random variables irrespective of age or duration. We assess and further develop this model. We generalize the random shocks from a univariate gamma to a univariate Tweedie distribution and allow the distributions to vary by age. Furthermore, since dependence between ages is an observed characteristic of mortality rate improvements, we formulate a multivariate framework using copulas. We find that dependence increases with age and introduce a suitable covariance structure, one that is related to the notion of a minimum. The resulting model provides a more realistic basis for capturing the risk of mortality improvements and serves to enhance longevity risk management for pension and insurance funds.
    Electronic ISSN: 2227-9091
    Topics: Economics
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 65
    Publication Date: 2019-05-06
    Description: Risk models developed on one dataset are often applied to new data and, in such cases, it is prudent to check that the model is suitable for the new data. An important application is in the banking industry, where statistical models are applied to loans to determine provisions and capital requirements. These models are developed on historical data, and regulations require their monitoring to ensure they remain valid on current portfolios, often years after the models were developed. The Population Stability Index (PSI) is an industry standard for measuring whether the distribution of the current data has shifted significantly from the distribution of the data used to develop the model. This paper explores several disadvantages of the PSI and proposes the Prediction Accuracy Index (PAI) as an alternative. The superior properties and interpretation of the PAI are discussed, and it is concluded that the PAI can more accurately summarise the level of population stability, helping risk analysts and managers determine whether the model remains fit-for-purpose.
    Electronic ISSN: 2227-9091
    Topics: Economics
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
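The PSI discussed in the abstract above is not defined there; the industry-standard formula compares the binned score distribution at development time with the current one, PSI = sum_i (a_i - e_i) * ln(a_i / e_i). A minimal sketch (the bin counts are made up, and the common 0.1/0.25 rule-of-thumb thresholds are an industry convention, not part of the paper):

```python
import numpy as np

def psi(expected, actual, eps=1e-12):
    """Population Stability Index between two binned distributions.

    expected, actual: arrays of counts (or proportions) per bin.
    """
    e = np.asarray(expected, dtype=float)
    a = np.asarray(actual, dtype=float)
    e = e / e.sum()
    a = a / a.sum()
    # guard against empty bins before taking the log ratio
    e = np.clip(e, eps, None)
    a = np.clip(a, eps, None)
    return float(np.sum((a - e) * np.log(a / e)))

dev = [100, 200, 300, 250, 150]        # development-sample bin counts
cur_same = [20, 40, 60, 50, 30]        # same shape, different volume
cur_shift = [300, 250, 200, 150, 100]  # shifted population

print(psi(dev, cur_same))    # 0.0: identical shapes
print(psi(dev, cur_shift))   # ≈ 0.34: a material shift (rule of thumb: > 0.25)
```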
  • 66
    Publication Date: 2019-05-15
    Description: The beta coefficient plays a crucial role in finance as a risk measure of a portfolio in comparison to the benchmark portfolio. In the paper, we investigate statistical properties of the sample estimator for the beta coefficient. Assuming that both the holding portfolio and the benchmark portfolio consist of the same assets whose returns are multivariate normally distributed, we provide the finite sample and the asymptotic distributions of the sample estimator for the beta coefficient. These findings are used to derive a statistical test for the beta coefficient and to construct a confidence interval for the beta coefficient. Moreover, we show that the sample estimator is an unbiased estimator for the beta coefficient. The theoretical results are implemented in an empirical study.
    Electronic ISSN: 2227-9091
    Topics: Economics
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
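The sample estimator studied in the abstract above is the usual ratio of the sample covariance to the benchmark variance. The sketch below simulates a toy single-factor setting (a simplification, not the paper's multivariate-normal, same-assets construction) and attaches a large-sample confidence interval; all figures are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

# simulate benchmark returns and a portfolio with true beta of 1.5
n = 5000
r_b = rng.normal(0.0005, 0.01, n)
r_p = 1.5 * r_b + rng.normal(0.0, 0.005, n)

# sample estimator: beta_hat = Cov(r_p, r_b) / Var(r_b)
cov = np.cov(r_p, r_b, ddof=1)
beta_hat = cov[0, 1] / cov[1, 1]

# large-sample normal confidence interval from the OLS standard error
resid = r_p - beta_hat * r_b - (r_p.mean() - beta_hat * r_b.mean())
s_xx = (n - 1) * r_b.var(ddof=1)
se = np.sqrt(resid.var(ddof=2) / s_xx)
ci = (beta_hat - 1.96 * se, beta_hat + 1.96 * se)

print(beta_hat)   # close to the true 1.5
print(ci)
```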
  • 67
    Publication Date: 2019-05-01
    Description: We study the optimal excess-of-loss reinsurance problem when both the intensity of the claims arrival process and the claim size distribution are influenced by an exogenous stochastic factor. We assume that the insurer’s surplus is governed by a marked point process with dual-predictable projection affected by an environmental factor and that the insurance company can borrow and invest money at a constant real-valued risk-free interest rate r. Our model allows for stochastic risk premia, which take into account risk fluctuations. Using stochastic control theory based on the Hamilton-Jacobi-Bellman equation, we analyze the optimal reinsurance strategy under the criterion of maximizing the expected exponential utility of the terminal wealth. A verification theorem for the value function in terms of classical solutions of a backward partial differential equation is provided. Finally, some numerical results are discussed.
    Electronic ISSN: 2227-9091
    Topics: Economics
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 68
    Publication Date: 2019-05-01
    Description: Banks make profits from the difference between short-term and long-term loan interest rates. To issue loans, banks raise funds from capital markets. Since the long-term loan rate is relatively stable but short-term interest rates are usually variable, there is an interest rate risk. Therefore, banks need information about the optimal leverage strategies based on the current economic situation. Recent studies of the economic crisis by many economists showed that the crisis was due to excessive leveraging by “big banks”. This leveraging turns out to be close to Kelly’s optimal point. It is known that Kelly’s strategy does not address risk adequately. We used the return–drawdown ratio and the inflection point of Kelly’s cumulative return curve in a finite investment horizon to derive more conservative leverage levels. Moreover, we carried out a sensitivity analysis to determine strategies during a period of rising interest rates, which is the most important and risky period in which to leverage. Thus, we brought the theoretical results closer to practical applications. Furthermore, using the sensitivity analysis method, banks can change the allocation sizes of loans with different maturities to mediate the risks corresponding to different monetary policy environments. This provides bank managers with flexible tools for mitigating risk.
    Electronic ISSN: 2227-9091
    Topics: Economics
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
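For context, the Kelly point mentioned in the abstract above maximizes expected log-growth: with drift mu and volatility sigma, g(f) = f*mu - 0.5*f^2*sigma^2 is maximized at f* = mu/sigma^2. The sketch below uses made-up figures and the textbook continuous-time Kelly result, not the paper's return–drawdown construction; it also shows the fractional-Kelly trade-off the paper's more conservative levels exploit.

```python
# assumed figures: net interest spread drift mu and volatility sigma
mu, sigma = 0.03, 0.12

# Kelly-optimal leverage for log-wealth growth: f* = mu / sigma^2
f_kelly = mu / sigma**2

def g(f):
    """Expected log-growth rate at leverage f (maximized exactly at f*)."""
    return f * mu - 0.5 * f**2 * sigma**2

# "half-Kelly" trades a little growth for much less drawdown risk:
# g(f*/2) = 0.75 * g(f*), a standard identity for this quadratic
f_half = 0.5 * f_kelly

print(f_kelly)               # ≈ 2.08, i.e. roughly 2x leverage at the Kelly point
print(g(f_kelly), g(f_half))
```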
  • 69
    Publication Date: 2019-05-01
    Description: We explore the Monte Carlo steps required to reduce the sampling error of the estimated 99.9% quantile within an acceptable threshold. Our research is of primary interest to practitioners working in the area of operational risk measurement, where the annual loss distribution cannot be analytically determined in advance. Usually, the frequency and the severity distributions should be adequately combined and elaborated with Monte Carlo methods, in order to estimate the loss distributions and risk measures. Naturally, financial analysts and regulators are interested in mitigating sampling errors, as prescribed in EU Regulation 2018/959. In particular, the sampling error of the 99.9% quantile is of paramount importance, along the lines of EU Regulation 575/2013. The Monte Carlo error for the operational risk measure is here assessed on the basis of the binomial distribution. Our approach is then applied to realistic simulated data, yielding a comparable precision of the estimate with a much lower computational effort, when compared to bootstrap, Monte Carlo repetition, and two other methods based on numerical optimization.
    Electronic ISSN: 2227-9091
    Topics: Economics
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
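The binomial argument referenced in the abstract above can be sketched as a distribution-free confidence interval for the 99.9% quantile: among n sorted Monte Carlo draws, the number falling below the true quantile is Binomial(n, 0.999), so binomial quantiles pick the order statistics bounding the interval. A minimal version (the lognormal annual-loss proxy and all parameters are hypothetical):

```python
import numpy as np
from scipy import stats

def quantile_ci(sample, q=0.999, conf=0.95):
    """Approximate distribution-free CI for the q-quantile via order statistics.

    The rank of the true q-quantile among n sorted draws is Binomial(n, q),
    so binomial quantiles give the ranks bounding the interval.
    """
    x = np.sort(np.asarray(sample))
    n = len(x)
    alpha = 1 - conf
    lo = max(int(stats.binom.ppf(alpha / 2, n, q)), 0)
    hi = min(int(stats.binom.ppf(1 - alpha / 2, n, q)), n - 1)
    return x[lo], x[hi]

rng = np.random.default_rng(1)
sample = rng.lognormal(mean=10, sigma=2, size=200_000)  # heavy-tailed loss proxy
lo, hi = quantile_ci(sample, 0.999)
print(lo, hi)   # sampling band around the simulated 99.9% quantile
```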
  • 70
    Publication Date: 2019-05-01
    Description: Two insurance companies I1, I2 with reserves R1(t), R2(t) compete for customers, such that in a suitable differential game the smaller company I2 with R2(0) < R1(0) aims at minimizing R1(t) − R2(t) by using the premium p2 as control, while the larger I1 aims at maximizing it by using p1. Deductibles K1, K2 are fixed but may be different. If K1 > K2 and I2 is the leader choosing its premium first, conditions for a Stackelberg equilibrium are established. For gamma-distributed rates of claim arrivals, explicit equilibrium premiums are obtained and shown to depend on the running reserve difference. The analysis is based on the diffusion approximation to a standard Cramér-Lundberg risk process extended to allow investment in a risk-free asset.
    Electronic ISSN: 2227-9091
    Topics: Economics
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 71
    Publication Date: 2019-05-21
    Description: I document a sizeable bias that might arise when valuing out-of-the-money American options via the Least Squares Method proposed by Longstaff and Schwartz (2001). The key point of this algorithm is the regression-based estimate of the continuation value of an American option. If this regression is ill-posed, the procedure might deliver biased results. The price of the American option might even fall below the price of its European counterpart. For call options, this is likely to occur when the dividend yield of the underlying is high. This distortion is documented within the standard Black–Scholes–Merton model as well as within its most common extensions (the jump-diffusion, stochastic volatility and stochastic interest rate models). Finally, I propose two easy and effective workarounds that fix this distortion.
    Electronic ISSN: 2227-9091
    Topics: Economics
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
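A minimal Least Squares Monte Carlo sketch for an American put, with illustrative Black–Scholes–Merton parameters (the paper's diagnosis concerns out-of-the-money options, where the in-the-money regression set shrinks and the polynomial fit becomes ill-conditioned):

```python
import numpy as np

def lsm_american_put(S0, K, r, sigma, T, n_steps=50, n_paths=100_000, seed=0, deg=3):
    """Least Squares Monte Carlo (Longstaff-Schwartz 2001) for an American put."""
    rng = np.random.default_rng(seed)
    dt = T / n_steps
    disc = np.exp(-r * dt)
    # simulate geometric Brownian motion paths
    z = rng.standard_normal((n_paths, n_steps))
    S = S0 * np.exp(np.cumsum((r - 0.5 * sigma**2) * dt + sigma * np.sqrt(dt) * z, axis=1))
    payoff = np.maximum(K - S[:, -1], 0.0)        # cashflow if held to expiry
    for t in range(n_steps - 2, -1, -1):
        payoff *= disc
        itm = K - S[:, t] > 0                     # regress on in-the-money paths only
        if itm.sum() > deg + 1:
            coef = np.polyfit(S[itm, t], payoff[itm], deg)
            cont = np.polyval(coef, S[itm, t])    # estimated continuation value
            exercise = (K - S[itm, t]) > cont
            idx = np.where(itm)[0][exercise]
            payoff[idx] = K - S[idx, t]
    return float(np.mean(payoff * disc))

price = lsm_american_put(S0=100, K=100, r=0.05, sigma=0.2, T=1.0)
print(price)   # exceeds the European BSM put value of about 5.57
```

For a deep out-of-the-money option, `itm.sum()` can be tiny and `np.polyfit` then produces an unstable continuation-value estimate; this is the mechanism behind the bias the paper documents.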
  • 72
    Publication Date: 2019-05-02
    Description: An accurate assessment of the risk of extreme environmental events is of great importance for populations, authorities and the banking/insurance/reinsurance industry. Koch (2017) introduced a notion of spatial risk measure and a corresponding set of axioms which are well suited to analyze the risk due to events having a spatial extent, precisely such as environmental phenomena. The axiom of asymptotic spatial homogeneity is of particular interest since it allows one to quantify the rate of spatial diversification when the region under consideration becomes large. In this paper, we first investigate the general concepts of spatial risk measures and corresponding axioms further and thoroughly explain the usefulness of this theory for both actuarial science and practice. Second, in the case of a general cost field, we give sufficient conditions such that spatial risk measures associated with expectation, variance, value-at-risk as well as expected shortfall and induced by this cost field satisfy the axioms of asymptotic spatial homogeneity of order 0, −2, −1 and −1, respectively. Last but not least, in the case where the cost field is a function of a max-stable random field, we provide conditions on both the function and the max-stable field ensuring the latter properties. Max-stable random fields are relevant when assessing the risk of extreme events since they appear as a natural extension of multivariate extreme-value theory to the level of random fields. Overall, this paper improves our understanding of spatial risk measures as well as of their properties with respect to the space variable and generalizes many results obtained in Koch (2017).
    Electronic ISSN: 2227-9091
    Topics: Economics
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 73
    Publication Date: 2019-04-17
    Description: The presented research discusses general approaches to analyze and model healthcare data at the treatment level and at the store level. The paper consists of two parts: (1) a general analysis method for store-level product sales of an organization and (2) a treatment-level analysis method of healthcare expenditures. In the first part, our goal is to develop a modeling framework to help understand the factors influencing the sales volume of stores maintained by a healthcare organization. In the second part of the paper, we demonstrate a treatment-level approach to modeling healthcare expenditures. In this part, we aim to improve the operational-level management of a healthcare provider by predicting the total cost of medical services. From this perspective, treatment-level analyses of medical expenditures may help provide a micro-level approach to predicting the total amount of expenditures for a healthcare provider. We present a model for analyzing a specific type of medical data, which may arise commonly in a healthcare provider’s standardized database. We do this by using an extension of the frequency-severity approach to modeling insurance expenditures from the actuarial science literature.
    Electronic ISSN: 2227-9091
    Topics: Economics
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
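The frequency-severity decomposition mentioned in the abstract above models the number of treatments and the cost per treatment separately, so the total cost is a compound sum. A minimal Poisson-gamma sketch with made-up parameters (the paper uses an extension of this approach, not this exact specification):

```python
import numpy as np

rng = np.random.default_rng(42)

# frequency: number of treatments per patient-year ~ Poisson(lam)
# severity: cost per treatment ~ Gamma(shape, scale); total = sum of costs
lam, shape, scale = 2.0, 2.0, 500.0
n_patients = 100_000

counts = rng.poisson(lam, n_patients)
total = np.array([rng.gamma(shape, scale, k).sum() for k in counts])

# compound-Poisson moments: E[S] = lam * E[X], Var[S] = lam * E[X^2]
mean_theory = lam * shape * scale                       # = 2000
var_theory = lam * shape * (shape + 1) * scale**2       # = 3,000,000

print(total.mean(), mean_theory)
print(total.var(), var_theory)
```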
  • 74
    Publication Date: 2019-04-18
    Description: Banks in India have gone through structural changes in the last three decades. The prices that banks charge depend on the competitive levels in the banking sector and the risk their assets and liabilities carry on banks’ balance sheets. The traditional Lerner Index indicates competitive levels. However, this measure does not account for risk, and this study introduces a risk-adjusted Lerner Index for evaluating competition in Indian banking for the period 1996 to 2016. The market power estimated through the adjusted Lerner Index has been declining since 1996, which indicates an improvement in competitive conditions over the period. Further, as indicated by the risk-adjusted Lerner Index, the Indian banking system exerts much less market power, and hence is more competitive, than the traditional Lerner Index suggests.
    Electronic ISSN: 2227-9091
    Topics: Economics
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
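The traditional Lerner Index mentioned in the abstract above is L = (P − MC)/P, the relative markup of price over marginal cost. The risk adjustment below is purely illustrative (the paper's exact construction is not given in the abstract): it strips an assumed risk premium out of the price before computing the markup, which moves the index in the direction the abstract reports.

```python
# standard Lerner index: L = (P - MC) / P, where P is the bank's output price
# (e.g., total revenue over assets) and MC its estimated marginal cost
def lerner(price, marginal_cost):
    return (price - marginal_cost) / price

# illustrative risk adjustment (hypothetical, not the paper's formula):
# remove an assumed risk premium from the price before computing the margin
def risk_adjusted_lerner(price, marginal_cost, risk_premium):
    p_adj = price - risk_premium
    return (p_adj - marginal_cost) / p_adj

p, mc, rp = 0.10, 0.06, 0.02
print(lerner(p, mc))                    # 0.4: substantial market power
print(risk_adjusted_lerner(p, mc, rp))  # 0.25: less power once risk is priced
```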
  • 75
    Publication Date: 2019-04-26
    Description: In this paper, we develop a framework for measuring, allocating and managing systemic risk. SystRisk, our measure of total systemic risk, captures the a priori cost to society for providing tail-risk insurance to the financial system. Our allocation principle distributes the total systemic risk among individual institutions according to their size-shifted marginal contributions. To describe economic shocks and systemic feedback effects, we propose a reduced form stochastic model that can be calibrated to historical data. We also discuss systemic risk limits, systemic risk charges and a cap and trade system for systemic risk.
    Electronic ISSN: 2227-9091
    Topics: Economics
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 76
    Publication Date: 2019-05-15
    Description: Quantiles of probability distributions play a central role in the definition of risk measures (e.g., value-at-risk, conditional tail expectation) which in turn are used to capture the riskiness of the distribution tail. Estimates of risk measures are needed in many practical situations such as in pricing of extreme events, developing reserve estimates, designing risk transfer strategies, and allocating capital. In this paper, we present the empirical nonparametric and two types of parametric estimators of quantiles at various levels. For parametric estimation, we employ the maximum likelihood and percentile-matching approaches. Asymptotic distributions of all the estimators under consideration are derived when data are left-truncated and right-censored, which is a typical loss variable modification in insurance. Then, we construct relative efficiency curves (REC) for all the parametric estimators. Specific examples of such curves are provided for exponential and single-parameter Pareto distributions for a few data truncation and censoring cases. Additionally, using simulated data we examine how wrong quantile estimates can be when one makes incorrect modeling assumptions. The numerical analysis is also supplemented with standard model diagnostics and validation (e.g., quantile-quantile plots, goodness-of-fit tests, information criteria) and presents an example of when those methods can mislead the decision maker. These findings pave the way for further work on RECs with potential for them being developed into an effective diagnostic tool in this context.
    Electronic ISSN: 2227-9091
    Topics: Economics
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
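Of the two parametric approaches named in the abstract above, percentile matching is the easier to sketch: equate a model quantile to its empirical counterpart and solve for the parameter. For an exponential severity with mean theta (complete data here, without the paper's truncation and censoring; all figures illustrative):

```python
import numpy as np

rng = np.random.default_rng(3)
theta_true = 1000.0                        # true exponential mean (loss scale)
sample = rng.exponential(theta_true, 50_000)

# percentile matching: solve F(x_p; theta) = p for theta, where
# F(x) = 1 - exp(-x / theta)  =>  theta_hat = x_p / (-log(1 - p))
p = 0.90
x_p = np.quantile(sample, p)
theta_pm = x_p / (-np.log(1 - p))

# maximum likelihood for the exponential is simply the sample mean
theta_mle = sample.mean()

print(theta_pm, theta_mle)   # both close to the true 1000
```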
  • 77
    Publication Date: 2019-04-17
    Description: Territory design and analysis using geographical loss cost are key aspects of auto insurance rate regulation. The major objective of this work is to study the design of geographical rating territories by maximizing within-group homogeneity and among-group heterogeneity from a statistical perspective, while maximizing the actuarial equity of pure premium, as required by insurance regulation. To achieve this goal, spatially-constrained clustering of industry-level loss cost was investigated. Within this study, in order to meet contiguity, which is a legal requirement on the design of geographical rating territories, a clustering approach based on Delaunay triangulation is proposed. Furthermore, an entropy-based approach was introduced to quantify the homogeneity of clusters, while both the elbow method and the gap statistic were used to determine the initial number of clusters. This study illustrates the usefulness of the spatially-constrained clustering approach in defining geographical rating territories for insurance rate regulation purposes. The significance of this work is to provide a new solution for better designing geographical rating territories. The proposed method can also be useful for other demographic data analyses because of the similar nature of the spatial constraint.
    Electronic ISSN: 2227-9091
    Topics: Economics
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
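The Delaunay-based contiguity idea in the abstract above can be sketched directly with `scipy.spatial`: treat two regions as neighbours when their centroids share a triangulation edge, and let the clustering merge only neighbours. The centroids below are random stand-ins for real territory coordinates.

```python
import numpy as np
from scipy.spatial import Delaunay

# build a contiguity (adjacency) structure from territory centroids
rng = np.random.default_rng(7)
centroids = rng.uniform(0, 100, size=(30, 2))   # 30 hypothetical territory centroids

tri = Delaunay(centroids)
neighbours = {i: set() for i in range(len(centroids))}
for simplex in tri.simplices:                   # each simplex is a triangle (i, j, k)
    for a in simplex:
        for b in simplex:
            if a != b:
                neighbours[a].add(b)

# a spatially-constrained clustering would then only merge regions that are
# neighbours here, which enforces the contiguity requirement
print(sum(len(v) for v in neighbours.values()) // 2, "Delaunay edges")
```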
  • 78
    Publication Date: 2019-04-15
    Description: Annuity providers are increasingly exposed to longevity risk due to the increase in life expectancy. To hedge this risk, new longevity derivatives have been proposed (longevity bonds, q-forwards, S-swaps…). Although academic researchers, policy makers and practitioners have talked about them for years, longevity-linked securities are not widely traded in financial markets, due in particular to the difficulty of pricing them. In this paper, we compare different existing pricing methods and propose a Cost of Capital approach. Our method is designed to be more consistent with the Solvency II requirement that longevity risk assessment be based on a one-year time horizon. The price of longevity risk is determined for an S-forward and an S-swap but can be used to price other longevity-linked securities. We also compare this Cost of Capital method with some classical pricing approaches. The Hull and White and extended CIR models are used to represent the evolution of mortality over time. We use data for the Belgian population to derive prices for the proposed longevity-linked securities under the different methods.
    Electronic ISSN: 2227-9091
    Topics: Economics
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 79
    Publication Date: 2019-05-08
    Description: We present several fast algorithms for computing the distribution of a sum of spatially dependent, discrete random variables in order to aggregate catastrophe risk. The algorithms are based on direct and hierarchical copula trees. Computing speed comes from the fact that loss aggregation at branching nodes is based on a combination of a fast approximation to brute-force convolution, arithmetization (regriding) and the linear complexity of the method for computing the distribution of a comonotonic sum of risks. We discuss the impact of tree topology on the second-order moments and tail statistics of the resulting distribution of the total risk. We test the performance of the presented models by accumulating ground-up losses for 29,000 risks affected by hurricane peril.
    Electronic ISSN: 2227-9091
    Topics: Economics
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
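The brute-force convolution step performed at a branching node can be sketched in a few lines: for independent discrete losses on a common grid, the pmf of the sum is the convolution of the pmfs (the paper's speed-ups, i.e. the fast approximation, regriding and comonotonic aggregation, are not shown). The two toy risks below are made up.

```python
import numpy as np

def convolve_losses(p_x, p_y):
    """Distribution of X + Y for independent discrete losses on {0, 1, 2, ...}.

    p_x[k] = P(X = k); the pmf of the sum is the convolution of the two pmfs.
    """
    return np.convolve(p_x, p_y)

# two small risks with losses in common units
p_a = np.array([0.5, 0.3, 0.2])   # P(A=0), P(A=1), P(A=2)
p_b = np.array([0.7, 0.2, 0.1])

p_total = convolve_losses(p_a, p_b)
print(p_total)          # pmf of A + B on {0, ..., 4}
print(p_total.sum())    # 1.0
```

At a branching node of a copula tree this pairwise convolution would be applied repeatedly, with regriding in between to keep the support size bounded.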
  • 80
    Publication Date: 2019-03-29
    Description: In small populations, mortality rates are characterized by great volatility, and the datasets are often available only for a few years and suffer from missing data. Therefore, standard mortality models may produce highly uncertain and biologically improbable projections. In this paper, we deal with the mortality projections of the Maltese population, a small country with fewer than 500,000 inhabitants, whose data on exposures and observed deaths suffer from all the typical problems of small populations. We concentrate our analysis on older-adult mortality. Following some recent suggestions in the literature, we assume that the mortality of a small population can be modeled starting from the mortality of a bigger one (the reference population) plus a spread. The first part of the paper is dedicated to the choice of the reference population; we then test alternative mortality models. Finally, we verify the capacity of the proposed approach to reduce the volatility of the mortality projections. The results show that the model is able to significantly reduce the uncertainty of projected mortality rates and to ensure their coherent and biologically reasonable evolution.
    Electronic ISSN: 2227-9091
    Topics: Economics
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 81
    Publication Date: 2019-03-16
    Description: In the field of mortality, the Lee–Carter based approach can be considered the milestone of stochastic mortality forecasting. We could define a “Lee–Carter model family” that embraces all developments of this model, including its first formulation (1992), which remains the benchmark for comparing the performance of newer models. In the Lee–Carter model, the κt parameter, describing the mortality trend over time, plays an important role in future mortality behavior. The traditional ARIMA process usually used to model κt shows evident limitations in describing the future mortality shape, and for the forecasting phase a more plausible approach should allow for a nonlinear shape of the projected mortality rates. We therefore propose an alternative to ARIMA processes based on a deep learning technique. More precisely, in order to capture the pattern of the κt series over time more accurately, we apply a Recurrent Neural Network with a Long Short-Term Memory architecture and integrate it with the Lee–Carter model to improve its predictive capacity. The proposed approach provides significant gains in predictive accuracy and also avoids the a priori selection of time chunks; indeed, it is common practice among academics to delete periods in which the noise overwhelms the signal or the data quality is insufficient. The strength of the Long Short-Term Memory network lies in its ability to treat this noise and adequately reproduce it in the forecasted trend, thanks to an architecture that takes significant long-term patterns into account.
    Electronic ISSN: 2227-9091
    Topics: Economics
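For context on the entry above: the classical benchmark treats κt as a random walk with drift, giving a linear extrapolation, while an LSTM is trained on sliding windows of the series and can produce a nonlinear projected shape. A minimal sketch with made-up κt values (the network training itself is omitted):

```python
# Hypothetical kappa_t series (mortality trend index, made-up values)
kappa = [10.0, 8.9, 8.1, 6.8, 6.0, 4.7, 4.1, 3.0]

# Classical benchmark: random walk with drift, kappa_{t+1} = kappa_t + d
d = (kappa[-1] - kappa[0]) / (len(kappa) - 1)        # drift estimate
forecast = [kappa[-1] + d * h for h in range(1, 4)]  # linear extrapolation

# An LSTM instead learns from sliding windows of past values, which is what
# allows a nonlinear projected shape
window = 3
X = [kappa[i:i + window] for i in range(len(kappa) - window)]
y = [kappa[i + window] for i in range(len(kappa) - window)]
```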
  • 82
    Publication Date: 2019-03-06
    Description: This paper explores the stochastic collocation technique, applied to a monotonic spline, as an arbitrage-free and model-free interpolation of implied volatilities. We explore various spline formulations, including B-spline representations. We explain how to calibrate the different representations against market option prices, how to smooth out the market quotes, and how to choose a proper initial guess. The technique is then applied to concrete market options and the stability of the different approaches is analyzed. Finally, we consider a challenging example where convex spline interpolations lead to oscillations in the implied volatility, and compare the spline collocation results with those obtained through the arbitrage-free interpolation technique of Andreasen and Huge.
    Electronic ISSN: 2227-9091
    Topics: Economics
  • 83
    Publication Date: 2019-03-05
    Description: In this paper, we develop a contingent claim model to evaluate the equity, default risk, and efficiency gain/loss from managerial overconfidence of a shadow-banking life insurer under government purchases of distressed assets. Our paper focuses on managerial overconfidence whereby the chief executive officer (CEO) overestimates the returns on investment. The investment market faced by the life insurer is imperfectly competitive, and investment is core to the provision of profit-sharing life insurance policies. We show that CEO overconfidence raises the default risk in the life insurer’s equity returns, thereby adversely affecting financial stability; either shadow-banking involvement or a government bailout attenuates this unfavorable effect. There is an efficiency gain from CEO overconfidence to investment. A government bailout helps to reduce the life insurer’s default risk, but simultaneously reduces the efficiency gain from CEO overconfidence. Our results contribute to the managerial overconfidence literature by linking insurer shadow-banking involvement and government bailout, in particular during a financial crisis.
    Electronic ISSN: 2227-9091
    Topics: Economics
  • 84
    Publication Date: 2019-02-26
    Description: Estimation of future mortality rates still plays a central role for life insurers in pricing their products and managing longevity risk. In the literature on mortality modeling, a large number of stochastic models have been proposed, most of them forecasting future mortality rates by extrapolating one or more latent factors. The abundance of proposed models shows that forecasting future mortality from historical trends is non-trivial. Following the idea proposed in Deprez et al. (2017), we use machine learning algorithms, able to catch patterns that are not commonly identifiable, to calibrate a parameter (the machine learning estimator) that improves the goodness of fit of standard stochastic mortality models. The machine learning estimator is then forecasted within the Lee–Carter framework, allowing one to obtain a higher forecasting quality than the standard stochastic models. Out-of-sample forecasts are provided to verify model accuracy.
    Electronic ISSN: 2227-9091
    Topics: Economics
  • 85
    Publication Date: 2019-02-26
    Description: In this paper, we propose a credible regression approach with random coefficients to model and forecast the mortality dynamics of a given population with limited data. Age-specific mortality rates are modelled, and extrapolation methods are utilized to estimate future mortality rates. The results on Greek mortality data indicate that credibility regression produced more accurate forecasts than the Lee–Carter and Cairns–Blake–Dowd models. An application to pricing insurance-related products is also provided.
    Electronic ISSN: 2227-9091
    Topics: Economics
  • 86
    Molecular Diversity Preservation International
    Publication Date: 2019-02-26
    Description: We introduce a financial stress index that was developed by the Office of Financial Research (OFR FSI) and detail its purpose, construction, interpretation, and use in financial market monitoring. The index employs a novel and flexible methodology using daily data from global financial markets. Analysis for the 2000–2018 time period is presented. Using a logistic regression framework and dates of government intervention in the financial system as a proxy for stress events, we find that the OFR FSI performs well in identifying systemic financial stress. In addition, we find that the OFR FSI leads the Chicago Fed National Activity Index in a Granger causality analysis, suggesting that increases in financial stress help predict decreases in economic activity.
    Electronic ISSN: 2227-9091
    Topics: Economics
  • 87
    Publication Date: 2019-03-25
    Description: We study a portfolio selection problem in a continuous-time Itô–Markov additive market with prices of financial assets described by Markov additive processes that combine Lévy processes and regime switching models. Thus, the model takes into account two sources of risk: the jump diffusion risk and the regime switching risk. For this reason, the market is incomplete. We complete the market by enlarging it with the use of a set of Markovian jump securities, Markovian power-jump securities and impulse regime switching securities. Moreover, we give conditions under which the market is asymptotic-arbitrage-free. We solve the portfolio selection problem in the Itô–Markov additive market for the power utility and the logarithmic utility.
    Electronic ISSN: 2227-9091
    Topics: Economics
  • 88
    Publication Date: 2019-02-25
    Description: Risk management is a comparatively new field, and there is no core system of risk management in the construction industries of developing countries. In Pakistan, construction is an extremely risk-prone industry with a poor reputation for handling risk, although risk management is gradually gaining importance as a result of increased competition and construction activity. A survey-based study was therefore conducted to investigate the risk management practices used in construction projects in Pakistan. To achieve this objective, data were collected from 22 contractor firms working on 100 diverse projects. The analysis indicates that risk management is implemented at a low level in the local environment. The results also disclose a high degree of correlation between effective risk management and project success. The findings reveal the importance of risk management techniques, their usage and implications, and the effect of these techniques on the success of construction projects from the contractor’s perspective, thus making the case to key project participants for the use of risk management.
    Electronic ISSN: 2227-9091
    Topics: Economics
  • 89
    Publication Date: 2019-03-05
    Description: There is an increasing influence of machine learning in business applications, with many solutions already implemented and many more being explored. Since the global financial crisis, risk management in banks has gained more prominence, and there has been a constant focus on how risks are detected, measured, reported and managed. Considerable research in academia and industry has focused on developments in banking and risk management and on current and emerging challenges. This paper, through a review of the available literature, seeks to analyse and evaluate machine-learning techniques that have been researched in the context of banking risk management, and to identify areas or problems in risk management that have been inadequately explored and are potential areas for further research. The review shows that the application of machine learning to the management of banking risks such as credit risk, market risk, operational risk and liquidity risk has been explored, but it does not appear commensurate with the industry’s current level of focus on both risk management and machine learning. Many areas of bank risk management remain that could significantly benefit from studying how machine learning can be applied to address specific problems.
    Electronic ISSN: 2227-9091
    Topics: Economics
  • 90
    Publication Date: 2019-03-11
    Description: This paper studies the optimal investment and consumption strategies in a two-asset model. A dynamic Value-at-Risk constraint is imposed to manage the wealth process. By using Value at Risk as the risk measure during the investment horizon, the decision maker can dynamically monitor the exposed risk and quantify the maximum expected loss over a finite horizon period at a given confidence level. In addition, the decision maker has to filter the key economic factors to make decisions. Considering the cost of filtering the factors, the decision maker aims to maximize the utility of consumption over a finite horizon. By using the Kalman filter, a partially observed system is converted into a completely observed one. However, due to the cost of information processing, the decision maker cannot process the information in an arbitrarily rational manner and can only make decisions on the basis of the limited observed signals. A genetic algorithm was developed to find the optimal investment and consumption strategies and the observation strength. Numerical simulation results are provided to illustrate the performance of the algorithm.
    Electronic ISSN: 2227-9091
    Topics: Economics
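The Kalman filtering step mentioned in the entry above can be illustrated with a scalar state-space model, where a latent factor is inferred from noisy signals; all parameter values here are arbitrary:

```python
# Minimal scalar Kalman filter: latent factor x_t = a*x_{t-1} + w_t, observed
# through y_t = x_t + v_t (all parameter values are arbitrary illustrations)
def kalman_step(m, P, y, a=0.9, q=0.04, r=0.25):
    m_pred, P_pred = a * m, a * a * P + q      # predict
    K = P_pred / (P_pred + r)                  # Kalman gain
    m_new = m_pred + K * (y - m_pred)          # filtered mean
    P_new = (1.0 - K) * P_pred                 # filtered variance
    return m_new, P_new

m, P = 0.0, 1.0                                # prior on the latent factor
for y in [0.5, 0.7, 0.6]:                      # observed signals
    m, P = kalman_step(m, P, y)
```

Each update shrinks the posterior variance P, which is how the partially observed system becomes a completely observed one in the filtered coordinates.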
  • 91
    Publication Date: 2019-02-25
    Description: Determining the distributions of functions of random variables is one of the most important problems in statistics and applied mathematics, because such distributions have a wide range of applications in economics, finance, risk management, science, and other areas. However, most studies focus only on independent variables, or on some common distributions such as multivariate normal joint distributions, for functions of dependent random variables. To bridge this gap in the literature, we first derive general formulas to determine both the density and the distribution of the product of two or more random variables via copulas, capturing the dependence structures among the variables. We then propose an approach combining a Monte Carlo algorithm, a graphical approach, and numerical analysis to efficiently estimate both density and distribution. We illustrate our approach by examining the shapes and behaviors of the density and distribution of the product of two log-normal random variables under several different copulas, including Gaussian, Student-t, Clayton, Gumbel, Frank, and Joe copulas, and estimate some common measures, including Kendall’s coefficient, mean, median, standard deviation, skewness, and kurtosis, for the distributions. We find that different types of copulas affect the behavior of the distributions differently. In addition, we discuss the behaviors under all the copulas above with the same Kendall’s coefficient. Our results are the foundation of any further study that relies on the density and cumulative probability functions of the product of two or more random variables; the theory developed in this paper is thus useful for academics, practitioners, and policy makers.
    Electronic ISSN: 2227-9091
    Topics: Economics
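A minimal Monte Carlo sketch in the spirit of the entry above: the product of two log-normal variables coupled by a Gaussian copula. Standard log-normal marginals and ρ = 0.5 are assumptions chosen for illustration, not the paper's settings:

```python
import math
import random

random.seed(0)
rho = 0.5          # Gaussian-copula correlation (assumed)
n = 100_000

products = []
for _ in range(n):
    # Correlated standard normals via the 2-D Cholesky factor
    z1 = random.gauss(0.0, 1.0)
    z2 = rho * z1 + math.sqrt(1.0 - rho * rho) * random.gauss(0.0, 1.0)
    products.append(math.exp(z1) * math.exp(z2))   # product of log-normals

mean = sum(products) / n
# For this special case the exact mean is exp((2 + 2*rho)/2) = exp(1.5)
```

For general copulas (Clayton, Gumbel, Frank, Joe) only the sampling step changes; the rest of the estimation pipeline is identical.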
  • 92
    Publication Date: 2019-03-08
    Description: Value at Risk (VaR) illustrates the maximum potential loss under a given confidence level, and is a single indicator that evaluates risk while ignoring any information about income. The present paper generalizes one-dimensional VaR to two-dimensional VaR with income–risk double indicators. We first construct a double VaR with (μ, σ²) (or (μ, VaR²)) indicators, and deduce the joint confidence region of (μ, σ²) (or (μ, VaR²)) by virtue of the two-dimensional likelihood ratio method. Finally, an example covering the empirical analysis of the two double-VaR models is presented.
    Electronic ISSN: 2227-9091
    Topics: Economics
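For intuition on how a (μ, σ²) pair maps to a one-dimensional VaR, here is a sketch under an assumed normal P&L model; this is a textbook formula for illustration, not the paper's likelihood-ratio construction:

```python
from statistics import NormalDist

# VaR at level alpha under an assumed normal model:
# VaR_alpha = -(mu + sigma * z_alpha), the loss exceeded with probability alpha
def var_normal(mu, sigma, alpha=0.05):
    z = NormalDist().inv_cdf(alpha)    # z_0.05 is about -1.645
    return -(mu + sigma * z)

v = var_normal(mu=0.01, sigma=0.02)    # roughly 0.0229
```

The double-VaR idea pairs μ with σ² (or VaR²) and replaces the single number by a joint confidence region for the pair.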
  • 93
    Publication Date: 2019-02-20
    Description: Based on a rich dataset of recoveries donated by a debt collection business, recovery rates for non-performing loans taken from a single European country are modelled using linear regression, linear regression with Lasso, beta regression and inflated beta regression. We also propose a two-stage model: a beta mixture model combined with a logistic regression model. The proposed model allows us to capture the multimodal distribution we found for these recovery rates. All models were built using loan characteristics, default data and collections data prior to purchase by the debt collection business. The intended use of the models is to estimate future recovery rates for improved risk assessment, capital requirement calculations and bad debt management. The models were compared using a range of quantitative performance measures under K-fold cross-validation. Among all the models, we found that the proposed two-stage beta mixture model performs best.
    Electronic ISSN: 2227-9091
    Topics: Economics
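The two-stage idea above, a logistic model gating between mixture components, can be caricatured as follows; the weights and component means are invented for illustration and are not the fitted model:

```python
import math

# Stage 1: logistic model for the probability of the "high recovery" mode;
# Stage 2: expected recovery as a mixture of two component means.
# All parameters below are invented for illustration.
def predict_recovery(x, w0=-1.0, w1=2.0, mean_low=0.15, mean_high=0.75):
    p_high = 1.0 / (1.0 + math.exp(-(w0 + w1 * x)))        # logistic gate
    return p_high * mean_high + (1.0 - p_high) * mean_low  # mixture mean

r = predict_recovery(1.0)   # sigmoid(1) is about 0.73, so r is near 0.59
```

The gate is what lets the model reproduce a multimodal recovery-rate distribution that a single regression cannot.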
  • 94
    Publication Date: 2019-02-19
    Description: As is well known, the benefit of restricting attention to Lévy processes without positive jumps is the “W, Z scale functions paradigm”, by which knowledge of the scale functions W, Z extends immediately to other risk control problems. The same is largely true for strong Markov processes X_t, with the notable distinctions that (a) it is more convenient to use the differential exit functions ν, δ as a “basis”, and (b) it is not yet known how to compute ν, δ or W, Z beyond the Lévy, diffusion, and a few other cases. The unifying framework outlined in this paper suggests, however, via an example, that the spectrally negative Markov and Lévy cases are very similar (except for the level of work involved in computing the basic functions ν, δ). We illustrate the potential of the unified framework by introducing a new objective for the optimization of dividends, inspired by the de Finetti problem of maximizing expected discounted cumulative dividends until ruin, where we replace ruin with an optimally chosen Azema–Yor/generalized draw-down/regret/trailing stopping time. This is defined as a hitting time of the “draw-down” process Y_t = sup_{0≤s≤t} X_s − X_t obtained by reflecting X_t at its maximum. This new variational problem has been solved in a parallel paper.
    Electronic ISSN: 2227-9091
    Topics: Economics
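The draw-down process Y_t = sup_{0≤s≤t} X_s − X_t and its hitting time can be computed directly on a discrete path; the path values below are hypothetical:

```python
# Draw-down process Y_t = sup_{0<=s<=t} X_s - X_t, with a draw-down stopping
# time: the first t at which Y_t reaches a level d
def drawdown_time(path, d):
    running_max = float("-inf")
    for t, x in enumerate(path):
        running_max = max(running_max, x)  # sup_{0<=s<=t} X_s
        if running_max - x >= d:
            return t
    return None  # the level is never reached

path = [1.0, 1.5, 1.2, 1.8, 1.1, 0.9]      # hypothetical X_t path
t_stop = drawdown_time(path, d=0.5)        # draw-down first reaches 0.5 at t=4
```

Replacing ruin (hitting a fixed lower level) with this draw-down time is what makes the stopping rule trail the running maximum.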
  • 95
    Publication Date: 2019-02-22
    Description: Extrapolative methods are among the most commonly adopted forecasting approaches in the literature on projecting future mortality rates. It can be argued that there are two types of mortality models using this approach. The first extracts patterns in the age, time and cohort dimensions, either deterministically or stochastically. The second uses non-parametric smoothing techniques to model mortality and thus places no explicit constraints on the model. We argue that, from a forecasting point of view, the main difference between the two types of models is whether they treat recent and historical information equally in the projection process. In this paper, we compare the forecasting performance of the two types of models using Great Britain male mortality data from 1950–2016. We also conduct a robustness test to see how sensitive the forecasts are to changes in the length of the historical data used to calibrate the models. The main conclusion of the study is that more recent information should be given more weight in the forecasting process, as it has greater predictive power than historical information.
    Electronic ISSN: 2227-9091
    Topics: Economics
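The conclusion of the entry above, that recent observations deserve more weight than historical ones, is often operationalized with geometric decay; a minimal sketch (the decay value is an arbitrary illustration):

```python
# Exponentially weighted mean: observation i gets weight lam**(n-1-i), so
# recent data dominate historical data as the decay lam shrinks below 1
def ew_mean(xs, lam=0.8):
    n = len(xs)
    weights = [lam ** (n - 1 - i) for i in range(n)]
    return sum(w * x for w, x in zip(weights, xs)) / sum(weights)

m = ew_mean([1.0, 2.0, 3.0, 4.0])   # pulled above the plain mean of 2.5
```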
  • 96
    Publication Date: 2019-02-01
    Description: Standardized longevity risk transfers often involve modeling the mortality rates of multiple populations. Some researchers have found that the mortality indexes of selected countries are cointegrated, meaning that a linear relationship exists between the indexes. A vector error correction model (VECM) has been used to incorporate this relation, thereby forcing the mortality rates of multiple populations to revert to a long-run equilibrium. However, the long-run equilibrium may change over time, and it is crucial to incorporate such changes so that mortality dependence is adequately modeled. In this paper, we develop a framework to examine the presence of equilibrium changes and to incorporate them into the mortality model. In particular, we focus on equilibrium changes caused by the threshold effect, the phenomenon whereby mortality indexes alternate between different VECMs depending on the value of a threshold variable. Our framework comprises two steps. In the first step, a statistical test is performed to examine the presence of a threshold effect in the VECM for multiple mortality indexes. In the second step, a threshold vector error correction model (TVECM) is fitted to the mortality indexes and model adequacy is evaluated. We illustrate this framework with the mortality data of the England and Wales (EW) and Canadian populations. We further apply the TVECM to forecast future mortalities and to price an illustrative longevity bond with the multivariate Wang transform. Our numerical results show that the TVECM predicts much faster mortality improvement for EW and Canada than the single-regime VECM, and thus the incorporation of the threshold effect significantly increases the longevity bond price.
    Electronic ISSN: 2227-9091
    Topics: Economics
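The threshold effect described above, switching the error-correction strength by regime, can be sketched in one function; all coefficients and the threshold are assumed for illustration:

```python
# Threshold error-correction sketch: the adjustment speed alpha switches
# between two regimes depending on the lagged spread between two mortality
# indexes (all coefficients assumed, purely illustrative).
def correction(spread, alpha_low=-0.1, alpha_high=-0.5, tau=1.0):
    alpha = alpha_high if abs(spread) > tau else alpha_low
    return alpha * spread   # contribution to the change in the index

big = correction(2.0)       # outer regime: strong pull toward equilibrium
small_c = correction(0.5)   # inner regime: weak pull toward equilibrium
```

A single-regime VECM uses one alpha everywhere; the TVECM lets the reversion speed depend on how far the indexes have drifted apart.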
  • 97
    Publication Date: 2019-01-09
    Description: One of the main challenges investors have to face is model uncertainty. Typically, the dynamics of the assets are modeled using two parameters: the drift vector and the covariance matrix, which are both uncertain. Since the variance/covariance parameter is assumed to be estimated with a certain level of confidence, we focus on drift uncertainty in this paper. Building on filtering techniques and learning methods, we use a Bayesian learning approach to solve the Markowitz problem and provide a simple and practical procedure to implement the optimal strategy. To illustrate the value added of the optimal Bayesian learning strategy, we compare it with an optimal nonlearning strategy that keeps the drift constant at all times. In order to emphasize the advantage of the Bayesian learning strategy over the nonlearning one in different situations, we experiment with three different investment universes: indices of various asset classes, currencies, and smart beta strategies.
    Electronic ISSN: 2227-9091
    Topics: Economics
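Bayesian learning of an uncertain drift has a closed-form conjugate update when the return variance is treated as known; a sketch with invented numbers (this is the standard normal-normal update, not the paper's full filtering setup):

```python
# Conjugate Bayesian update of an unknown drift mu (variance assumed known):
# prior mu ~ N(m0, s0sq), observed returns r_t ~ N(mu, sigmasq)
def update_drift(m0, s0sq, returns, sigmasq):
    n = len(returns)
    rbar = sum(returns) / n
    post_var = 1.0 / (1.0 / s0sq + n / sigmasq)
    post_mean = post_var * (m0 / s0sq + n * rbar / sigmasq)
    return post_mean, post_var

# Invented prior and returns: the posterior mean shrinks the prior 0.05
# toward the sample mean 0.06 as observations accumulate
m, v = update_drift(m0=0.05, s0sq=0.04, returns=[0.02, 0.10, 0.06], sigmasq=0.09)
```

Feeding the posterior mean into the Markowitz problem at each date is what distinguishes the learning strategy from the constant-drift one.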
  • 98
    Publication Date: 2019-04-19
    Description: While the main conceptual issue related to deposit insurance is moral hazard risk, the main technical issue is inaccurate calibration of the implied volatility, which can raise the risk of generating arbitrage. In this paper, we first argue that, once the no-moral-hazard condition is imposed, removing arbitrage is equivalent to removing static arbitrage. We then propose a simple quadratic model to parameterize the implied volatility and remove static arbitrage. The process of removing the static risk is as follows: using a machine learning approach with a regularized cost function, we update the parameters in such a way that butterfly arbitrage is ruled out, and by implementing a calibration method we impose conditions on the parameters of each time slice that rule out calendar spread arbitrage. Eliminating the effects of both butterfly and calendar spread arbitrage therefore makes the implied volatility surface free of static arbitrage.
    Electronic ISSN: 2227-9091
    Topics: Economics
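One static-arbitrage constraint mentioned above, absence of butterfly arbitrage, is equivalent to the call price being convex in strike; a check on an equally spaced strike grid (the quotes are invented):

```python
# Absence of butterfly arbitrage requires call prices to be convex in strike:
# C(K-h) - 2*C(K) + C(K+h) >= 0 on an equally spaced strike grid.
def butterfly_free(calls):
    return all(calls[i - 1] - 2.0 * calls[i] + calls[i + 1] >= -1e-12
               for i in range(1, len(calls) - 1))

calls = [0.50, 0.32, 0.20, 0.12, 0.07]  # hypothetical quotes
ok = butterfly_free(calls)              # all butterfly spreads non-negative
```

Calendar spread arbitrage is the analogous condition across maturities (total implied variance non-decreasing in maturity at fixed moneyness).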
  • 99
    Publication Date: 2019-04-14
    Description: A recently introduced accounting standard, namely the International Financial Reporting Standard 9, requires banks to build provisions based on forward-looking expected loss models. When there is a significant increase in the credit risk of a loan, additional provisions must be charged to the income statement. Banks need to set, for each loan, a threshold defining what constitutes such a significant increase in credit risk. A low threshold allows banks to recognize credit risk early, but leads to income volatility. We introduce a statistical framework to model this trade-off between early recognition of credit risk and avoidance of excessive income volatility. We analyze the resulting optimization problem for different models, relate it to the banking stress test of the European Union, and illustrate it using default data from Standard and Poor’s.
    Electronic ISSN: 2227-9091
    Topics: Economics
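The threshold choice the entry above optimizes can be illustrated with a simple staging rule; the doubling threshold and PDs below are assumptions, not the paper's calibration:

```python
# Move a loan to Stage 2 (lifetime expected loss) when its current PD
# exceeds the PD at origination by more than a chosen multiple.
# A lower threshold recognizes risk earlier but re-stages more loans,
# increasing income volatility.
def stage(pd_now, pd_orig, threshold=2.0):
    return 2 if pd_now / pd_orig > threshold else 1

s_risky = stage(pd_now=0.05, pd_orig=0.02)   # ratio 2.5: Stage 2
s_safe = stage(pd_now=0.03, pd_orig=0.02)    # ratio 1.5: Stage 1
```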
  • 100
    Publication Date: 2019-04-11
    Description: As decarbonisation progresses and conventional thermal generation gradually gives way to other technologies including intermittent renewables, there is an increasing requirement for system balancing from new and also fast-acting sources such as battery storage. In the deregulated context, this raises questions of market design and operational optimisation. In this paper, we assess the real option value of an arrangement under which an autonomous energy-limited storage unit sells incremental balancing reserve. The arrangement is akin to a perpetual American swing put option with random refraction times, where a single incremental balancing reserve action is sold at each exercise. The power used is bought in an energy imbalance market (EIM), whose price we take as a general regular one-dimensional diffusion. The storage operator’s strategy and its real option value are derived in this framework by solving the twin timing problems of when to buy power and when to sell reserve. Our results are illustrated with an operational and economic analysis using data from the German Amprion EIM.
    Electronic ISSN: 2227-9091
    Topics: Economics