1,523 results for "Expected loss"
Search Results
52. Improving Defense Against Intelligent Adversaries
- Author
-
Cox, Louis Anthony, Jr.
- Published
- 2013
- Full Text
- View/download PDF
53. Cost of Equity for Private Companies: The Integrated Pricing Model
- Author
-
Oricchio, Gianluca
- Published
- 2012
- Full Text
- View/download PDF
54. Required Returns and the CAPM
- Author
-
Estrada, Javier
- Published
- 2011
- Full Text
- View/download PDF
55. Term Structure Dimension
- Author
-
Schlösser, Anna, Beckmann, M., editor, Künzi, H. P., editor, Fandel, G., managing editor, Trockel, W., managing editor, Dawid, H., series editor, Dimitrow, D., series editor, Gerber, A., series editor, Haake, C. J., series editor, Hofmann, C., series editor, Pfeiffer, T., series editor, Slowiński, R., series editor, and Zijm, W. H. M., series editor
- Published
- 2011
- Full Text
- View/download PDF
56. Bayesian Decision-Theoretic Design of Experiments Under an Alternative Model
- Author
-
Antony M. Overstall and James McGree
- Subjects
FOS: Computer and information sciences, Statistics and Probability, Mathematical optimization, Computer science, Applied Mathematics, Design of experiments, Bayesian probability, Statistical model, Basis function, Function (mathematics), Methodology (stat.ME), Joint probability distribution, Robustness (economics), Expected loss, Statistics - Methodology - Abstract
Traditionally, Bayesian decision-theoretic design of experiments proceeds by choosing a design to minimise the expectation of a given loss function over the space of all designs. The loss function encapsulates the aim of the experiment, and the expectation is taken with respect to the joint distribution of all unknown quantities implied by the statistical model that will be fitted to observed responses. In this paper, an extended framework is proposed whereby the expectation of the loss is taken with respect to a joint distribution implied by an alternative statistical model. Motivations for this include promoting robustness, ensuring computational feasibility, and allowing realistic prior specification when deriving a design. To aid in exploring the new framework, an asymptotic approximation to the expected loss under an alternative model is derived, and the properties of different loss functions are established. The framework is then demonstrated on a linear regression versus full-treatment model scenario, on estimating the parameters of a non-linear model under model discrepancy, and on a cubic spline model under an unknown number of basis functions. Supplementary material appears as an appendix.
- Published
- 2022
- Full Text
- View/download PDF
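The design criterion in entry 56 admits a compact statement. A sketch in notation chosen here for illustration (lambda is the loss, pi the joint distribution implied by the fitted model, pi-tilde the one implied by the alternative model; the symbols are not taken from the paper):

```latex
% Classical Bayesian design: minimise the expected loss under the joint
% distribution pi(theta, y | d) implied by the model that will be fitted.
% Extended framework: take the same expectation under an alternative model.
\[
  d^{*} = \arg\min_{d \in \mathcal{D}}
    \mathbb{E}_{\pi(\theta, y \mid d)}\!\left[\lambda(\theta, y, d)\right],
  \qquad
  \tilde{d} = \arg\min_{d \in \mathcal{D}}
    \mathbb{E}_{\tilde{\pi}(\theta, y \mid d)}\!\left[\lambda(\theta, y, d)\right].
\]
```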
57. Enhanced Building-Specific Seismic Performance Assessment
- Author
-
Miranda, Eduardo and Fardis, Michael N., editor
- Published
- 2010
- Full Text
- View/download PDF
58. House Advantage
- Author
-
Ethier, Stewart N.
- Published
- 2010
- Full Text
- View/download PDF
59. Virtual Enterprise Transactions: A Cost Model
- Author
-
D'Atri, A., Motro, A., D'Atri, Alessandro, editor, and Saccà, Domenico, editor
- Published
- 2010
- Full Text
- View/download PDF
60. Determination of optimal calibration intervals by balancing financial exposure against measurement costs.
- Author
-
Pashnina, Nadezhda
- Subjects
CALIBRATION, FLOW measurement, FLOW meters, MEASURING instruments, STOCHASTIC processes, FINANCIAL risk - Abstract
Maintenance strategies associated with fiscal measurement systems have traditionally been based on a time-based approach, in which calibration activities are scheduled at fixed intervals without much consideration of the previous performance of the measuring instruments. Current economic challenges encourage abandoning time-based maintenance in favour of other strategies, such as a risk-based approach to maintenance. This approach is used to determine an appropriate calibration frequency by balancing financial exposure against measurement costs. There is no universally applicable single best practice for the determination of calibration intervals; this is reflected in the few officially published documents, which provide only general guidance. To address the need for a better understanding of how optimal calibration intervals are determined, a detailed calculation algorithm suitable for practical use is developed, based on the general guidance on the risk-based approach to maintenance. The calculation algorithm is based on statistical analysis of the calibration data of an individual measuring instrument, which describes the progressive change of a measurement error as a nonstationary random process. The measurement error, together with the selected loss function, defines the financial exposure, which forms the 'total cost' function by summation with the measurement costs, determined as the cost of ownership of the measurement instrument. The minimum of the total cost function determines the optimal calibration interval of the measuring instrument. A case study of ultrasonic flow meters operating in a custody-transfer gas flow measurement system is used to illustrate the calculation algorithm. It has been shown that the determination of calibration intervals is a complex mathematical and statistical process requiring accurate and sufficient data, including calibration results and registered costs. Application of the suggested calculation algorithm can be beneficial in the assessment and minimisation of financial risks associated with currently implemented maintenance strategies, as well as in the review of calibration intervals to balance financial exposure and measurement costs. [ABSTRACT FROM AUTHOR]
- Published
- 2018
- Full Text
- View/download PDF
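To make the cost-balancing idea of entry 60 concrete, here is a minimal sketch, not the author's algorithm: the drifting error is modelled as a random walk, and the cost figures, rates, and function names are illustrative assumptions.

```python
import numpy as np

def expected_exposure(T_days, drift_sigma=0.002, flow_value_per_day=50_000.0):
    """Expected financial exposure over one calibration interval of T days.

    Assumes the relative measurement error follows a zero-mean random walk
    with daily std drift_sigma, so E|error on day t| = sigma * sqrt(2t/pi).
    """
    t = np.arange(1, T_days + 1)
    mean_abs_error = drift_sigma * np.sqrt(2.0 * t / np.pi)
    return float(np.sum(mean_abs_error * flow_value_per_day))

def annual_total_cost(T_days, calibration_cost=20_000.0):
    """Annualised 'total cost': exposure per cycle plus calibration cost."""
    cycles_per_year = 365.0 / T_days
    return cycles_per_year * (expected_exposure(T_days) + calibration_cost)

intervals = np.arange(30, 731, 10)               # candidate intervals, days
costs = [annual_total_cost(T) for T in intervals]
best = intervals[int(np.argmin(costs))]          # minimum of the total cost
print(f"optimal calibration interval ~ {best} days")
```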
61. Management of Security Risks – A Controlling Model for Banking Companies
- Author
-
Faisst, Ulrich, Prokein, Oliver, Seese, Detlef, editor, Weinhardt, Christof, editor, and Schlottmann, Frank, editor
- Published
- 2008
- Full Text
- View/download PDF
62. Optimal Control Strategies for Incoming Inspection
- Author
-
Nickel, Stefan, Velten, Sebastian, Ziegler, Hans-Peter, Kalcsics, Jörg, editor, and Nickel, Stefan, editor
- Published
- 2008
- Full Text
- View/download PDF
63. Credit Risk Management
- Author
-
Franke, Jürgen, Härdle, Wolfgang K., and Hafner, Christian M.
- Published
- 2008
- Full Text
- View/download PDF
64. The Development of Expected-Loss Methods of Accounting for Credit Losses: A Review with Analysis of Comment Letters
- Author
-
Noor Hashim, Weijia Li, and John O'Hanlon
- Subjects
business.industry, Accounting, Financial crisis, Economics, Convergence (relationship), business, Expected loss - Abstract
SYNOPSIS After the financial crisis of the late 2000s, concern about delayed credit-loss recognition under the incurred-loss method prompted the FASB and the IASB to develop expected-loss methods. We review the development of these methods, including through comment-letter analysis. Initially, the FASB recommended immediate full recognition of expected losses, including at day one, and the IASB recommended spreading the recognition of initially expected losses across time. After unsuccessful attempts to converge based on proposals that partly reflected initial recommendations of each board, the boards eventually adopted different methods. We report that U.S. respondents largely opposed the FASB's final method, which required day-one recognition of all expected losses, and that non-U.S. respondents largely supported the IASB's final method, which required day-one recognition of 12-month expected losses. Day-one loss was controversial and impeded convergence. Our comment-letter analysis suggests that a day-one-loss-free more forward-looking incurred-loss method might provide a route to a more converged solution. JEL Classifications: G21; M41.
- Published
- 2021
- Full Text
- View/download PDF
65. Impact of Covid-19 on SME portfolios in Morocco: Evaluation of banking risk costs and the effectiveness of state support measures
- Author
-
Salim El Haddad and Mohamed El Habachi
- Subjects
Economics and Econometrics, Government, Index (economics), business.industry, Strategy and Management, Failure rate, State (functional analysis), probability of default, Linear discriminant analysis, Logistic regression, expected loss, expert opinion, HG1-9999, Econometrics, internal rating, Business, Business and International Management, Expected loss, Publication, Finance - Abstract
This study proposes a method for constructing rating tools using logistic regression and linear discriminant analysis to determine the risk profile of SME portfolios. The objective is, first, to evaluate the impact of the Covid-19 crisis by readjusting the profile of each company using expert opinion and, second, to evaluate the effectiveness of the measures taken by the Moroccan state to support companies during the pandemic. The analysis in this paper showed that the performance of the logistic regression and linear discriminant analysis models is almost equivalent based on the ROC curve. However, it was revealed that the logistic regression model minimizes the risk cost, represented in this study by the expected loss. For the support measures adopted by the Moroccan government, the study showed that the failure rate (critical situation) of the firms benefiting from the support is considerably lower than that of the non-beneficiaries. © 2021 LLC CPC Business Perspectives. All rights reserved.
- Published
- 2021
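A minimal sketch of the model comparison described in entry 65, on synthetic data rather than the paper's SME portfolio; the LGD value and exposure range are placeholder assumptions.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

# Fit logistic regression and LDA as rating models, compare ROC AUC, and
# aggregate the portfolio expected loss as EL = sum_i PD_i * LGD * EAD_i.
X, y = make_classification(n_samples=2000, n_features=8, weights=[0.9],
                           random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
ead = np.random.default_rng(0).uniform(50_000, 500_000, size=len(y_te))
LGD = 0.45  # assumed loss given default

for name, model in [("logit", LogisticRegression(max_iter=1000)),
                    ("LDA", LinearDiscriminantAnalysis())]:
    pd_hat = model.fit(X_tr, y_tr).predict_proba(X_te)[:, 1]
    el = float(np.sum(pd_hat * LGD * ead))
    print(f"{name}: AUC = {roc_auc_score(y_te, pd_hat):.3f}, "
          f"expected loss = {el:,.0f}")
```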
66. Ludens: A Gambling Addiction Prevention Program Based on the Principles of Ethical Gambling
- Author
-
Francisco Bueno, Mariano Chóliz, and Marta Marcos
- Subjects
Adult, Male, medicine.medical_specialty, Adolescent, Sociology and Political Science, media_common.quotation_subject, Gambling disorder, Prevention program, Young Adult, Risk-Taking, medicine, Humans, General Psychology, media_common, Original Paper, Schools, Health consequences, Public health, Addiction, Universal prevention, Addiction prevention, Ethical gambling, Economic benefits, Adolescence, Behavior, Addictive, Diagnostic and Statistical Manual of Mental Disorders, Gambling, Female, Psychology, Expected loss, Clinical psychology - Abstract
Gambling is legal in most countries. However, despite having some economic benefits, certain characteristics of gambling can have health consequences, rendering it a public health issue. The effects can be summarized according to the following three "laws" of ethical gambling: "Gambling Dynamics Law": companies' economic gains come directly from players' losses; "Expected Loss Law": the more one gambles, the greater the probability of losing; and "Addiction Law": the more one gambles, the greater the need to play again, leading to further losses. Ludens is a gambling addiction prevention program that has four goals: inform participants about gambling and gambling addiction; sensitize participants to the risks of gambling for health, especially addiction; promote a change in attitudes toward gambling; and alert participants to risky behaviors that can lead to addiction. The prevention program was implemented from 2017 to 2019. Fourteen psychologists presented it to 2372 adolescents (48.8% females, 51.2% males) aged 14–19 years, none of whom were university students, recruited from 42 Spanish high schools in 132 groups taking different courses. The main dependent variables analyzed were the monthly frequency of gambling, at-risk gambling, and gambling addiction (as measured by the National Opinion Research Center DSM-IV Screen for Gambling Problems, adapted to diagnose gambling disorder according to DSM-5, in which pathological gambling is considered an addictive disorder). Given that all of the gamblers were adolescents (most were minors), fulfilment of 1–3 of the DSM-5 diagnostic criteria was considered to indicate a risk of problem gambling. After the administration of Ludens, statistically significant reductions were observed in the three variables of interest: monthly frequency of gambling, percentage of adolescents with risky gambling, and percentage of adolescents with gambling disorder. The results were analyzed according to sex and age (minors vs. adolescents between 18 and 19 years old). The results obtained after applying the prevention program indicate that Ludens is effective as a universal prevention program for gambling addiction.
- Published
- 2021
- Full Text
- View/download PDF
67. Calculating lifetime expected loss for IFRS 9: which formula is measuring what?
- Author
-
Bernd Engelmann
- Subjects
050208 finance, IFRS 9, Floating interest rate, 05 social sciences, Risk management information systems, 050201 accounting, Prepayment of loan, Mathematical proof, Loan, 0502 economics and business, Econometrics, Economics, Cash flow, Expected loss, Finance - Abstract
Purpose: The purpose of this article is to derive formulas for the lifetime expected credit loss of loans that are required for the calculation of loan loss reserves under IFRS 9. This is done both for fixed-rate and floating-rate loans under different assumptions on LGD modeling, prepayment, and discount rates. Design/methodology/approach: This study provides exact formulas for lifetime expected credit loss derived analytically, together with the mathematical proofs of each expression. Findings: This article shows that the formula most commonly applied in the literature for calculating lifetime expected credit loss is inconsistent with measuring expected loss based on expected discounted cash flows. Formulas based on discounted cash flows always lead to more conservative numbers. Practical implications: For banks reporting under IFRS 9, the implication of this research is a better understanding of the different approaches used for computing lifetime expected loss, how they are connected, and what assumptions underlie each approach. This may lead to corrections in existing frameworks that make applications of risk management systems more consistent. Originality/value: While there is a lot of literature explaining IFRS 9 and evaluating its impact, none of the existing research has systematically analyzed the calculation of lifetime expected credit loss for this purpose and how the formula changes under different modeling assumptions. This gap is filled by this study.
- Published
- 2021
- Full Text
- View/download PDF
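For orientation, one common textbook form of lifetime expected credit loss with discounted cash flows, not necessarily any of the exact formulas derived in the article; the bullet-loan setup and all parameter values are illustrative.

```python
def lifetime_ecl(pd_annual, lgd, ead, years, discount_rate):
    """Lifetime ECL = sum_t S(t-1) * PD * LGD * EAD * DF(t), S = survival."""
    ecl, survival = 0.0, 1.0
    for t in range(1, years + 1):
        marginal_default = survival * pd_annual   # default occurs in year t
        df = (1.0 + discount_rate) ** -t          # discount factor to today
        ecl += marginal_default * lgd * ead * df
        survival *= 1.0 - pd_annual               # loan still performing
    return ecl

# 5-year bullet loan, 2% annual PD, 40% LGD, EAD 1,000,000, 5% discounting:
print(f"lifetime ECL = {lifetime_ecl(0.02, 0.40, 1_000_000, 5, 0.05):,.0f}")
```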
68. Understanding the impact of Covid-19 on Indian tourism sector through time series modelling
- Author
-
Md Ozair Arshad, Shahbaz Khan, Abid Haleem, Hannan Mansoor, Md Osaid Arshad, and Md Ekrama Arshad
- Subjects
Financial economics, business.industry, media_common.quotation_subject, 05 social sciences, Geography, Planning and Development, Globe, Management, Monitoring, Policy and Law, medicine.anatomical_structure, Work (electrical), Originality, Hospitality, Tourism, Leisure and Hospitality Management, 0502 economics and business, Economics, medicine, Autoregressive integrated moving average, Robustness (economics), business, Expected loss, 050203 business & management, 050212 sport, leisure & tourism, Tourism, Nature and Landscape Conservation, media_common - Abstract
Purpose: The Covid-19 pandemic is a unique and extraordinary situation for the globe, which has disrupted almost all aspects of life. In this global crisis, the tourism and hospitality sector has collapsed in almost all parts of the world, and the same is true for India. Therefore, this paper aims to investigate the impact of Covid-19 on the Indian tourism industry. Design/methodology/approach: This study develops an appropriate model to forecast the expected loss of foreign tourist arrivals (FTAs) in India over 10 months. Since FTAs follow a seasonal trend, the seasonal autoregressive integrated moving average (SARIMA) method is employed to forecast expected FTAs in India from March 2020 to December 2020. The results of the proposed model are then compared with those obtained by the Holt-Winters (H-W) model to check its robustness. Findings: The SARIMA model captures the monthly arrivals of foreign tourists and indicates that the expected loss of FTAs for the next three quarters is approximately 2 million, 2.3 million, and 3.2 million, respectively. Thus, in the next three quarters there will be an enormous downfall of FTAs, and appropriate measures need to be adopted. The comparison demonstrates that SARIMA is a better model than the H-W model. Originality/value: Several studies of pandemic-affected tourism sectors using different techniques have been reported. Earlier outbreaks were controlled and region-specific, whereas Covid-19 is a global threat with far-reaching ramifications and strong spreading power. This work is one of the first attempts to analyse the impact of Covid-19 on FTAs in India.
- Published
- 2021
- Full Text
- View/download PDF
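A minimal SARIMA sketch in the spirit of entry 68, run on a synthetic monthly series rather than the paper's FTA data; the (1,1,1)x(1,1,1,12) orders are placeholders, not the orders selected in the study.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.statespace.sarimax import SARIMAX

rng = np.random.default_rng(1)
idx = pd.date_range("2010-01", periods=120, freq="MS")
seasonal = 1e5 * (1 + 0.3 * np.sin(2 * np.pi * idx.month / 12))
trend = 1 + 0.02 * np.arange(120) / 12
y = pd.Series(seasonal * trend + rng.normal(0, 5e3, 120), index=idx)

model = SARIMAX(y, order=(1, 1, 1), seasonal_order=(1, 1, 1, 12))
res = model.fit(disp=False)
forecast = res.forecast(steps=10)     # ten-month-ahead expected arrivals
print(forecast.round(0))
# Expected loss of arrivals = forecast under normal conditions - actuals.
```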
69. Three-way Decision Models of Cognitive Computing in Pythagorean Fuzzy Environments
- Author
-
Tao Feng, Ju-Sheng Mi, Pin Sun, and Shao-Pu Zhang
- Subjects
Computer science, business.industry, Cognitive Neuroscience, Decision theory, Cognitive computing, Fuzzy set, Interval (mathematics), Machine learning, computer.software_genre, Fuzzy logic, Computer Science Applications, Computer Vision and Pattern Recognition, Artificial intelligence, Decision-making, business, Expected loss, computer, Decision model - Abstract
Loss functions, commonly believed to be the cost of cognitive computing, are a key element in decision-making, and three-way decisions can be regarded as a cognitive computing method that seeks to minimize the overall risks involved in the decision-making process. Recently, many studies on loss functions have been conducted based on fuzzy sets, intuitionistic fuzzy sets, and interval intuitionistic fuzzy sets. However, most of these studies draw conclusions based on two descriptions, which may fail to capture the whole picture of decision-making. In this paper, in order to improve the accuracy of decision-making, we propose loss functions based on three descriptions, adding a hesitation description to the Pythagorean fuzzy environment. Then, we redefine the expected loss functions, which allow people to make a decision with more uncertainty. Subsequently, on the basis of the Bayesian minimum risk decision theory, four strategies for dealing with expected losses are proposed, and three-way decision models are established. Finally, group decision models are discussed. Three-way decision models of real value loss functions and Pythagorean fuzzy loss functions based on three descriptions are proposed, and data analyses of different parameters show the feasibility of the three-way decision models.
- Published
- 2021
- Full Text
- View/download PDF
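The decision rule behind entry 69 reduces, for real-valued losses, to picking the action with minimum Bayesian expected loss. A sketch with plain numbers standing in for the paper's Pythagorean fuzzy loss functions; the loss values are illustrative.

```python
# lam[a] = (loss if the true state is positive, loss if it is negative).
lam = {
    "accept": (0.0, 6.0),   # wrongly accepting a negative case is costly
    "defer":  (1.0, 1.5),   # deferring costs a moderate amount either way
    "reject": (5.0, 0.0),   # wrongly rejecting a positive case is costly
}

def three_way_decision(p_positive):
    """Choose accept / defer / reject by minimum expected loss."""
    risk = {a: lp * p_positive + ln * (1.0 - p_positive)
            for a, (lp, ln) in lam.items()}
    return min(risk, key=risk.get), risk

for p in (0.1, 0.5, 0.9):
    action, risk = three_way_decision(p)
    print(p, action, {a: round(r, 2) for a, r in risk.items()})
```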
70. Optimization of Collateral Allocation for Corporate Loans : A nonlinear network problem minimizing the expected loss in case of default
- Author
-
Grägg, Sofia and Isacson, Paula
- Abstract
Collateral management has become an increasingly valuable aspect of credit risk. Managing collaterals and constructing accurate models for decision making can give any lender a competitive advantage and decrease overall risks. This thesis explores how to allocate securities on a set of loans such that the total expected loss in case of default is minimized. A nonlinear optimization problem is formulated, and several factors that affect the expected loss are considered. In order to incorporate regulations on collateral allocation, the model is formulated as a network problem. Finally, to account for the risk of the portfolio of securities, the Markowitz approach to variance is adopted. The model calculates a loss that is less than the average historical loss for the same type of portfolio. In the case of the network problem with many-to-many relations, an equal or higher expected loss is obtained. Similarly, when the variance constraint is included, the expected loss increases. This is because some solutions are excluded when links are removed and the variance constraint is imposed, forcing the optimization problem to choose a less optimal solution. The model has no limits on the number of collateral types and loans that can be included. An improvement of the model would be to account for the stochasticity of the collateral values; a further difficulty lies in validating the results, since the expected loss functions are based on theoretical analysis. Nonetheless, the results of the model can act as an upper bound on the expected loss, with a certain variance, since the average of the expected loss lies above the average of the historical loss.
- Published
- 2022
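Only the linear core of the allocation problem in entry 70 is sketched below (the thesis formulates a nonlinear network problem with a Markowitz variance constraint); exposures, probabilities, and collateral values are made-up numbers.

```python
import numpy as np
from scipy.optimize import linprog

# Allocate collateral values c_j across loans i to minimise
#   sum_i PD_i * LGD * (E_i - coverage_i),  0 <= coverage_i <= E_i,
# which is linear in the allocations x_ij.
pd_i = np.array([0.05, 0.02, 0.10])       # default probability per loan
exposure = np.array([100.0, 80.0, 60.0])  # loan exposures E_i
collateral = np.array([70.0, 50.0])       # security values c_j
LGD = 1.0
n_l, n_c = len(exposure), len(collateral)

c_obj = -(pd_i[:, None] * LGD * np.ones((n_l, n_c))).ravel()  # maximise coverage
A_ub, b_ub = [], []
for j in range(n_c):            # each security can be allocated at most once
    row = np.zeros(n_l * n_c); row[j::n_c] = 1.0
    A_ub.append(row); b_ub.append(collateral[j])
for i in range(n_l):            # coverage of a loan is capped at its exposure
    row = np.zeros(n_l * n_c); row[i * n_c:(i + 1) * n_c] = 1.0
    A_ub.append(row); b_ub.append(exposure[i])

res = linprog(c_obj, A_ub=np.array(A_ub), b_ub=b_ub, bounds=(0, None))
x = res.x.reshape(n_l, n_c)
el = float(np.sum(pd_i * LGD * (exposure - x.sum(axis=1))))
print("allocation:\n", x.round(1), "\nexpected loss:", round(el, 2))
```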
71. Analysis of Facility Systems’ Reliability When Subject to Attack or a Natural Disaster
- Author
-
Church, Richard, Scaparra, M. Paola, Fischer, Manfred M., editor, Hewings, Geoffrey J. D., editor, Nijkamp, Peter, editor, Snickars, Folke, editor, Murray, Alan T., editor, and Grubesic, Tony H., editor
- Published
- 2007
- Full Text
- View/download PDF
72. Optimal Portfolios Under Bounded Shortfall Risk and Partial Information
- Author
-
Wunderlich, Ralf, Sass, Jörn, Gabih, Abdelali, Waldmann, Karl-Heinz, editor, and Stocker, Ulrike M., editor
- Published
- 2007
- Full Text
- View/download PDF
73. Development of a New Corporate Evaluation Approach for Banks
- Author
-
Reuse, Svend
- Published
- 2007
- Full Text
- View/download PDF
74. Decision Analysis and Small Area Estimates
- Author
-
Alho, Juha M. and Spencer, Bruce D.
- Published
- 2005
- Full Text
- View/download PDF
75. An Intelligent Agent-Based Framework for Collaborative Information Security
- Author
-
Kuo, M. H., Zhang, Shichao, editor, and Jarvis, Ray, editor
- Published
- 2005
- Full Text
- View/download PDF
76. PU Active Learning for Recommender Systems
- Author
-
Jia-Jia Cai, Sheng-Jun Huang, Yuan Jiang, and Jia-Lue Chen
- Subjects
Learning problem, Computer Networks and Communications, business.industry, Computer science, Active learning (machine learning), General Neuroscience, Selection strategy, Computational intelligence, Recommender system, Machine learning, computer.software_genre, Task (project management), Artificial Intelligence, Multiple criteria, Artificial intelligence, business, Expected loss, computer, Software - Abstract
In recommender systems, supervised information is usually obtained from the historical data of users. For example, if a user watched a movie, then the user-movie pair will be marked as positive. On the other hand, the user-movie pairs that did not appear in the historical data could be either positive or negative. This phenomenon motivates us to formalize the recommendation task as a Positive Unlabeled (PU) learning problem. As a model trained on the biased historical data may not generalize well on future data, we propose an active learning approach to improve the model by querying further labels from the unlabeled data pool. With the target of querying as few instances as possible, an active selection strategy is proposed to minimize the expected loss and match the distribution between labeled and unlabeled data. Experiments are performed on both classification datasets and a movie recommendation dataset. Results demonstrate that the proposed approach can significantly reduce the labeling cost while achieving superior performance with regard to multiple criteria.
- Published
- 2021
- Full Text
- View/download PDF
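A generic active-selection sketch related to entry 76; only the uncertainty (expected-loss) part is illustrated, not the paper's distribution-matching criterion, and the data are synthetic.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
X_lab = rng.normal(size=(200, 10))
y_lab = (X_lab[:, 0] > 0).astype(int)        # stand-in for observed labels
X_pool = rng.normal(size=(5000, 10))         # unlabeled user-item pool

model = LogisticRegression().fit(X_lab, y_lab)
p = model.predict_proba(X_pool)[:, 1]
uncertainty = 1.0 - np.abs(2.0 * p - 1.0)    # peaks where p is near 0.5
query_idx = np.argsort(-uncertainty)[:20]    # 20 most informative queries
print("pairs to query:", query_idx[:10])
```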
77. How to estimate expected credit losses – ECL – for provisioning under IFRS 9
- Author
-
Mariya Gubareva
- Subjects
050208 finance, Credit default swap, IFRS 9, Bond, Yield (finance), 05 social sciences, 050201 accounting, Investment (macroeconomics), 0502 economics and business, Econometrics, Business, Expected loss, Finance, Credit risk, Valuation (finance)
Purpose: This paper provides an objective approach, based on available market information, capable of reducing the subjectivity inherently present in the process of expected loss provisioning under IFRS 9. Design/methodology/approach: This paper develops a two-step methodology. Calibrating the credit default swap (CDS)-implied default probabilities to the through-the-cycle default frequencies provides average weights of the default component in the spread for each forward term. Then, the impairment provisions are calculated for a sample of investment-grade and high-yield obligors by distilling their pure default-risk term structures from the respective term structures of spreads. This research demonstrates how to estimate credit impairment allowances compliant with the IFRS 9 framework. Findings: This study finds that for both investment-grade and high-yield exposures, the weight of the default component in the credit spreads always remains below 33%. The outcomes contrast with several previous results stating that the default risk premium accounts for at least 40% of CDS spreads. The proposed methodology is applied to calculate IFRS 9-compliant provisions for a sample of investment-grade and high-yield obligors. Research limitations/implications: Many issuers are not covered by individual Bloomberg valuation curves; however, a way to overcome this limitation is proposed. Practical implications: The proposed approach offers a clue for a better alignment of accounting practices, financial regulation, and credit risk management, using expected loss metrics across diverse silos inside organizations. It encourages adopting the proposed methodology, illustrating its application to a set of bond exposures. Originality/value: No previous research addresses impairment provisioning employing Bloomberg valuation curves. The study fills this gap.
- Published
- 2021
- Full Text
- View/download PDF
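A back-of-the-envelope sketch of the two-step idea in entry 77: scale the CDS spread by the weight of its default component, then invert the standard credit-triangle relation lambda = s / (1 - R). The 30% weight and 40% recovery are placeholder assumptions, not the paper's estimates.

```python
import numpy as np

spread_bps = 250.0        # observed CDS spread
default_weight = 0.30     # share of the spread attributed to default risk
recovery = 0.40

pure_default_spread = spread_bps / 1e4 * default_weight
hazard = pure_default_spread / (1.0 - recovery)   # flat default intensity
for t in (1, 3, 5):
    print(f"PD({t}y) = {1.0 - np.exp(-hazard * t):.4%}")
```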
78. Covid-19 Death Risk Estimation Using VaR Method
- Author
-
Agnieszka Surowiec and Tomasz Warowny
- Subjects
Rate of return, Risk management -- Mathematical models, Time horizon, Mortality -- Statistics, General Business, Management and Accounting, Normal distribution, Kurtosis, Econometrics, Portfolio, Volatility (finance), General Economics, Econometrics and Finance, Expected loss, COVID-19 (Disease) -- Mortality, Value at risk, Mathematics
PURPOSE: The purpose of this paper is to show that the Value at Risk (VaR) method can be used to estimate the death rate from Covid-19 infection. DESIGN/METHODOLOGY/APPROACH: The VaR method allows for risk measurement and estimation of the highest expected loss on a portfolio at an assumed confidence level over a specified time horizon. The most important assumption affecting the calculation method is that price changes in financial markets follow a normal distribution. FINDINGS: It appears that by appropriately re-defining the concepts of assets and portfolio rates of return, we can describe the volatility in the numbers of deaths caused by Covid-19. Using the Shapiro-Wilk and skewness-kurtosis tests, we also confirmed that the distribution of the rates of return for the death numbers follows a normal distribution. PRACTICAL IMPLICATIONS: The VaR method allows the number of deaths to be estimated from current trends, which can be utilised to better manage available resources in order to reduce casualties. We use data on the number of deaths in the Visegrad Group (V4) countries as a case study to test the effectiveness and accuracy of the VaR method in a different, non-financial domain. ORIGINALITY/VALUE: The theory used in this paper is currently applied mainly to financial investments. We use it to describe a social phenomenon, the number of deaths; this approach has not been seen in the literature so far.
- Published
- 2021
- Full Text
- View/download PDF
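A parametric (normal) VaR sketch in the spirit of entry 78, applied to daily "rates of return" of a death-count series; the series below is synthetic, not the V4 data.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(42)
daily = rng.poisson(100, size=200).astype(float)  # synthetic daily deaths
returns = np.diff(daily) / daily[:-1]             # day-over-day relative change

mu, sigma = returns.mean(), returns.std(ddof=1)
var_99 = mu + sigma * norm.ppf(0.99)              # 99% one-day upper bound
print(f"99% one-day VaR of the change rate: {var_99:.2%}")
print(f"worst-case next-day deaths ~ {daily[-1] * (1 + var_99):.0f}")
```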
79. Examining omitted variable bias in anchoring premium estimates: Evidence based on assessed value
- Author
-
Ran Lu-Andrews, John M. Clapp, and Tingyu Zhou
- Subjects
Economics and Econometrics, 050208 finance, media_common.quotation_subject, 05 social sciences, Anchoring, Omitted-variable bias, Residual, Accounting, Loss aversion, 0502 economics and business, Value (economics), Econometrics, Economics, Quality (business), 050207 economics, Market value, Expected loss, Finance, media_common
We propose a new assessed value approach to control for the amount of persistent unobserved quality. We apply our approach to a well-established two-stage framework developed by Genesove and Mayer (GM, 2001), who test the effect of an expected loss on final transaction prices in the housing market. We show that our assessed value model effectively mitigates the omitted variable bias and produces similar results to GM when the first-stage residual is included. Importantly, our model does not rely on repeat sales and therefore provides a powerful new tool for estimating market value. Results are robust to alternative specifications, to controlling for loan-to-value ratios, to replacing final sale price with listing price, to alternative fixed effects, to subperiods, to different holding periods, to simulated quality, to excluding flippers, and to controlling for improvements between sales.
- Published
- 2021
- Full Text
- View/download PDF
80. Acceptance of mobile loyalty cards in the German B2C consumer goods market
- Author
-
Sandra Schneider
- Subjects
Customer retention, HF5001-6182, media_common.quotation_subject, Control (management), m31, m37, Structural equation modeling, German, Competition (economics), tam2, 0502 economics and business, Loyalty, Business, Marketing, media_common, 05 social sciences, language.human_language, Work (electrical), language, mobile loyalty cards, 050211 marketing, d12, Expected loss, 050203 business & management, acceptance
The consumer goods market is characterized by strong competition. Thus, to bind customers to the company, higher priority needs to be given to customer retention measures. Such measures include the loyalty card, but use of the physical card is declining. To counteract this decline, mobile loyalty cards were developed. The basis for the use of mobile loyalty cards is sufficient consumer acceptance. This work contributes to explaining acceptance in the form of usage behavior. Based on the Technology Acceptance Model 2 (TAM2) and the literature, hypotheses were derived and a research model was developed. For model testing, a dataset of 255 participants was generated through an online survey and analyzed using partial least squares structural equation modeling (PLS-SEM). The results show that in addition to financial benefits, convenience benefits and psychological factors also have an influence on acceptance. Furthermore, usage behavior is not negatively influenced by the expected loss of control over personal data. Based on the findings, indications for marketing implementation are given for the confirmed factors.
- Published
- 2021
81. Time-Dependent Probabilistic Approach of Failure Mode and Effect Analysis
- Author
-
Hyeon-ae Jang and Seungsik Min
- Subjects
fmea, td-fmea, rpn, rpm, expected loss, risk-evaluation, Technology, Engineering (General). Civil engineering (General), TA1-2040, Biology (General), QH301-705.5, Physics, QC1-999, Chemistry, QD1-999 - Abstract
Failure mode and effect analysis (FMEA) is one of the most widely employed pre-evaluation techniques for avoiding risks that may occur during product design and manufacturing phases. However, use of the risk priority number (RPN) in traditional FMEA leads to difficulties in quantifying the degree of risk involved. This study proposes a probabilistic time-dependent FMEA (TD-FMEA) approach to overcome the limitations encountered with traditional FMEA. To this end, the proposed method defines the risk priority metric (RPM) as a priority decision value: the RPM corresponds to the product of the expected loss and the occurrence rate of the failure-cause. By assuming exponential and case functions for the occurrence and detection time instants, the expected loss corresponding to each failure-cause can be evaluated. The utility of the proposed approach is illustrated through an automotive-manufacturing case study.
- Published
- 2019
- Full Text
- View/download PDF
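A sketch of the RPM defined in entry 81: occurrence rate of a failure-cause times the expected loss given that cause, with exponentially distributed detection time. All rates and costs are illustrative placeholders.

```python
def expected_loss(detect_rate, loss_per_day):
    """E[loss] when loss accrues linearly until detection ~ Exp(detect_rate);
    the mean detection delay is 1 / detect_rate."""
    return loss_per_day / detect_rate

failure_causes = {   # name: (occurrences/year, detection rate/day, loss $/day)
    "seal wear":      (2.0, 1 / 10.0, 500.0),
    "sensor drift":   (0.5, 1 / 30.0, 800.0),
    "loose fastener": (4.0, 1 / 2.0, 200.0),
}
rpm = {name: occ * expected_loss(rate, loss)
       for name, (occ, rate, loss) in failure_causes.items()}
for cause, value in sorted(rpm.items(), key=lambda kv: -kv[1]):
    print(f"{cause:15s} RPM = {value:8,.0f} $/yr")   # priority ranking
```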
82. Time-Dependent Probabilistic Model for Hierarchical Structure in Failure Mode and Effect Analysis
- Author
-
Hyeon-ae Jang and Seungsik Min
- Subjects
fmea, htd-fmea, hierarchical structure, time-dependent probabilistic loss model, expected loss, risk evaluation, Technology, Engineering (General). Civil engineering (General), TA1-2040, Biology (General), QH301-705.5, Physics, QC1-999, Chemistry, QD1-999 - Abstract
Failure mode and effect analysis (FMEA) is a structured technique for identifying risks that may occur during a given stage of a system’s life cycle. However, the use of the risk priority number (RPN) in traditional FMEA results in difficulties with regard to quantification of the degree of risk in the hierarchical failure structure. This study proposes the use of a hierarchical time-dependent FMEA approach to overcome the limitations encountered during the implementation of traditional FMEA approaches. In place of the RPN, a probabilistic loss model is developed under a hierarchical structure considering the elapsed time from the failure-cause (FC) to the system failure. By assuming exponential and case functions for each occurrence and detection time instant, the expected loss corresponding to each FC can be evaluated. As a result of the practical application of the time-dependent probabilistic model through the numerical example, we could reasonably evaluate the risk from the cause of failure in the hierarchical structure in terms of economic loss.
- Published
- 2019
- Full Text
- View/download PDF
83. The Economics of Information Security Investment
- Author
-
Gordon, Lawrence A., Loeb, Martin P., Jajodia, Sushil, editor, Camp, L. Jean, editor, and Lewis, Stephen, editor
- Published
- 2004
- Full Text
- View/download PDF
84. Supply, Demand and Regulation of Catastrophe Insurance
- Author
-
Grace, Martin F., Klein, Robert W., Kleindorfer, Paul R., Murray, Michael R., and Crew, Michael A., editor
- Published
- 2003
- Full Text
- View/download PDF
85. Valuation of Certificate of Deposit (CD) With Transfer Option
- Author
-
Kariya, Takeaki and Liu, Regina Y.
- Published
- 2003
- Full Text
- View/download PDF
86. Logistic retainment interval dose exploration design for Phase I clinical trials of cytotoxic agents
- Author
-
Thomas A. Murray
- Subjects
Statistics and Probability, Maximum Tolerated Dose, Bayesian probability, Antineoplastic Agents, Interval (mathematics), Logistic regression, 01 natural sciences, Article, 010104 statistics & probability, 03 medical and health sciences, 0302 clinical medicine, Statistics, Humans, Medicine, Computer Simulation, Pharmacology (medical), 030212 general & internal medicine, Dosing, 0101 mathematics, Probability, Pharmacology, Clinical Trials, Phase I as Topic, Dose-Response Relationship, Drug, Cytotoxins, business.industry, Bayes Theorem, Odds ratio, Clinical trial, Research Design, Cohort, business, Expected loss - Abstract
Phase I studies of a cytotoxic agent often aim to identify the dose that provides an investigator specified target dose-limiting toxicity (DLT) probability. In practice, an initial cohort receives a dose with a putative low DLT probability, and subsequent dosing follows by consecutively deciding whether to retain the current dose, escalate to the adjacent higher dose, or de-escalate to the adjacent lower dose. This article proposes a Phase I design derived using a Bayesian decision-theoretic approach to this sequential decision-making process. The design consecutively chooses the action that minimizes posterior expected loss where the loss reflects the distance on the log-odds scale between the target and the DLT probability of the dose that would be given to the next cohort under the corresponding action. A logistic model is assumed for the log odds of a DLT at the current dose with a weakly informative t-distribution prior centered at the target. The key design parameters are the pre-specified odds ratios for the DLT probabilities at the adjacent higher and lower doses. Dosing rules may be pre-tabulated, as these only depend on the outcomes at the current dose, which greatly facilitates implementation. The recommended default version of the proposed design improves dose selection relative to many established designs across a variety of scenarios.
- Published
- 2021
- Full Text
- View/download PDF
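A simplified sketch of the rule described in entry 86, not the article's exact design: theta is the log-odds of DLT at the current dose with a weakly informative t prior centred at the target, each action shifts the log-odds by a pre-specified log odds ratio, and the loss is the squared log-odds distance from the target.

```python
import numpy as np
from scipy.special import expit, logit
from scipy.stats import t as t_dist

target = 0.25
theta = np.linspace(-6, 4, 2001)                  # grid over log-odds of DLT
prior = t_dist.pdf((theta - logit(target)) / 1.5, df=3)  # unnormalised prior

def choose_action(n_patients, n_dlt, or_up=2.0, or_down=0.5):
    like = expit(theta) ** n_dlt * (1 - expit(theta)) ** (n_patients - n_dlt)
    post = prior * like
    post /= post.sum()                            # grid posterior
    shifts = {"de-escalate": np.log(or_down), "retain": 0.0,
              "escalate": np.log(or_up)}
    risk = {a: float(np.sum(post * (theta + s - logit(target)) ** 2))
            for a, s in shifts.items()}           # posterior expected loss
    return min(risk, key=risk.get), risk

action, risk = choose_action(n_patients=6, n_dlt=3)   # 3/6 DLTs: too toxic
print(action, {a: round(r, 3) for a, r in risk.items()})
```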
87. Unrecognized Expected Credit Losses and Bank Share Prices
- Author
-
Barrett Wheeler
- Subjects
Economics and Econometrics, 050208 finance, education, 05 social sciences, 050201 accounting, Monetary economics, Negatively associated, Accounting, 0502 economics and business, Capital requirement, Business, Expected loss, health care economics and organizations, Finance, Stock (geology), Credit risk
Accounting for credit losses under U.S. GAAP is transitioning from an incurred to an expected loss model. The model change was motivated by concerns that reporting only incurred losses does not provide investors with sufficient and timely information about banks’ credit risk. In this paper, I develop a measure of lifetime expected credit losses using vintage analysis to examine whether stock prices reflect information about unrecognized expected credit losses in an incurred loss regime. Consistent with investors being able to obtain information about expected losses that are not recognized in the financial statements, I find that unrecognized expected credit losses are negatively associated with bank stock prices. The pricing of these losses is stronger for larger banks, consistent with lower costs of obtaining this information for banks with better information environments. I also find that recorded allowances were less than estimated expected losses, on average, consistent with concerns that implementing the expected loss model will adversely impact regulatory capital adequacy.
- Published
- 2021
- Full Text
- View/download PDF
88. Worst Expected Best method for assessment of probabilistic network expected value at risk: application in supply chain risk management
- Author
-
Abroon Qazi and Mecit Can Emre Simsekler
- Subjects
Supply chain risk management, 021103 operations research, Computer science, Strategy and Management, Supply chain, 05 social sciences, 0211 other engineering and technologies, Probabilistic logic, Bayesian network, 02 engineering and technology, General Business, Management and Accounting, Risk appetite, Risk analysis (engineering), 0502 economics and business, Expected loss, 050203 business & management, Value at risk, Vulnerability (computing)
Purpose: The purpose of this paper is to develop and operationalize a process for prioritizing supply chain risks that is capable of capturing the value at risk (VaR), the maximum loss expected at a given confidence level for a specified timeframe, associated with risks within a network setting. Design/methodology/approach: The proposed "Worst Expected Best" method is theoretically grounded in the framework of Bayesian Belief Networks (BBNs), which is considered an effective technique for modeling interdependency across uncertain variables. An algorithm is developed to operationalize the proposed method, which is demonstrated using a simulation model. Findings: Point estimate-based methods used for aggregating the network expected loss for a given supply chain risk network are unable to project the realistic risk exposure associated with a supply chain. The proposed method helps in establishing the expected network-wide loss for a given confidence level. The vulnerability- and resilience-based risk prioritization schemes for the model considered in this paper have a very weak correlation. Originality/value: This paper introduces a new "Worst Expected Best" method to the literature on supply chain risk management that helps in assessing the probabilistic network expected VaR for a given supply chain risk network. Further, new risk metrics are proposed to prioritize risks relative to a specific VaR that reflects the decision-maker's risk appetite.
- Published
- 2021
- Full Text
- View/download PDF
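A Monte-Carlo sketch of the gap between a point-estimate expected loss and a network VaR, as discussed in entry 88; dependence is mimicked here with a shared random factor rather than a Bayesian Belief Network, and all numbers are invented.

```python
import numpy as np

rng = np.random.default_rng(7)
n_sims, n_risks = 100_000, 5
p_base = np.array([0.10, 0.05, 0.08, 0.02, 0.15])  # marginal risk probabilities
impact = np.array([1.0, 3.0, 2.0, 8.0, 0.5])       # loss if risk occurs ($M)

factor = rng.random(n_sims)                        # common driver in [0, 1]
occurs = rng.random((n_sims, n_risks)) < p_base * (0.5 + factor[:, None])
loss = occurs @ impact                             # simulated network loss

print(f"point-estimate EL : {p_base @ impact:.2f} $M")
print(f"simulated mean EL : {loss.mean():.2f} $M")
print(f"95% network VaR   : {np.quantile(loss, 0.95):.2f} $M")
```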
89. U.S. Interstate Trade Will Mitigate the Negative Impact of Climate Change on Crop Profit
- Author
-
Zhangliang Chen, Sandy Dall'Erba, and Noé J. Nava
- Subjects
International level, Economics and Econometrics, business.industry, Natural resource economics, Climate change, Agricultural and Biological Sciences (miscellaneous), Profit (economics), Crop, Agriculture, Climate impact, Weather data, Economics, business, Expected loss
According to the current Intergovernmental Panel on Climate Change report, climate change will increase the probability of occurrence of droughts in some areas. Recent contributions at the international level indicate that trade is expected to act as an efficient tool to mitigate the adverse effect of future climate conditions on agriculture. However, no contribution has yet focused on the similar capacity of trade within any country. The U.S. is an obvious choice given that many climate impact studies focus on its agriculture and around 90% of the U.S. crop trade is domestic. Combining a recent state-to-state trade flow dataset with detailed drought records at a fine spatial and temporal resolution, this paper highlights first that trade increases as the destination state experiences more drought, and inversely in the origin state. As a result, crop growers' profits depend on both local and trade partners' weather conditions. Projections based on future weather data convert the crop grower's expected loss without trade into expected profit. As such, this paper challenges the estimates of the current climate impact literature and concludes that trade is expected to act as a $14.5 billion mitigation tool in the near future.
- Published
- 2021
- Full Text
- View/download PDF
90. Minimax Efficient Random Experimental Design Strategies With Application to Model-Robust Design for Prediction
- Author
-
Timothy W. Waite and David C. Woods
- Subjects
Statistics and Probability, Mathematical optimization, Randomization, Computer science, 05 social sciences, Minimax, 01 natural sciences, 010104 statistics & probability, Robust design, Decision strategy, 0502 economics and business, 0101 mathematics, Statistics, Probability and Uncertainty, Statistical decision theory, Expected loss, Game theory, 050205 econometrics
In game theory and statistical decision theory, a random (i.e., mixed) decision strategy often outperforms a deterministic strategy in minimax expected loss. As experimental design can be viewed as a game pitting the Statistician against Nature, the use of a random strategy to choose a design will often be beneficial. However, the topic of minimax-efficient random strategies for design selection is mostly unexplored, with consideration limited to Fisherian randomization of the allocation of a predetermined set of treatments to experimental units. Here, for the first time, novel and more flexible random design strategies are shown to have better properties than their deterministic counterparts in linear model estimation and prediction, including stronger bounds on both the expectation and survivor function of the loss distribution. Design strategies are considered for three important statistical problems: (i) parameter estimation in linear potential outcomes models, (ii) point prediction from a correct linear model, and (iii) global prediction from a linear model taking into account an L2-class of possible model discrepancy functions. The new random design strategies proposed for (iii) give a finite bound on the expected loss, a dramatic improvement compared to existing deterministic exact designs, for which the expected loss is unbounded. Supplementary materials for this article are available online.
- Published
- 2021
- Full Text
- View/download PDF
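The core comparison in entry 90 can be written as a game-value inequality. A sketch in notation chosen here for illustration (d a design, psi Nature's choice such as the model discrepancy, pi a probability measure over designs):

```latex
% Randomisation helps the minimiser: the best mixed strategy is never worse
% than the best deterministic design in minimax expected loss.
\[
  \min_{d \in \mathcal{D}} \max_{\psi \in \Psi} L(d, \psi)
  \;\ge\;
  \min_{\pi \in \mathcal{P}(\mathcal{D})} \max_{\psi \in \Psi}
    \mathbb{E}_{d \sim \pi}\!\left[L(d, \psi)\right].
\]
```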
91. Expected loss utility for natural hazards and its application in pricing property insurance products
- Author
-
Mei Cai, Guo Wei, and Wenfei Xiu
- Subjects
Actuarial science, Cumulative prospect theory, Computer science, media_common.quotation_subject, 0208 environmental biotechnology, 02 engineering and technology, 010501 environmental sciences, 01 natural sciences, 020801 environmental engineering, Property insurance, Natural hazard, Risk assessment, Preference relation, Function (engineering), Expected loss, Expected utility hypothesis, 0105 earth and related environmental sciences, General Environmental Science, media_common - Abstract
Due to climatic hazards and extreme weather events, the pricing of property insurance products is increasingly attracting the attention of policyholders, insurance companies, and governments. Pricing based on market-oriented methods has to consider the factors affecting policyholders' perceived value, and pricing strategy design generates the need for natural hazard risk assessments. A natural hazard risk assessment is closely related to the human factors of a disaster-bearing body. In response to this need, we design an extension of expected utility that is inconsistent with the additive expected utility, considering the human factors of policyholders, referred to as the expected loss utility (ELU). The ELU presents two improvements over the currently used utility. First, subjective probability, derived from individual predictions over acts, is applied in the ELU function to overcome the disadvantage that the objective probability attached to uncertainty does not reflect the uncertainty of human factors. Policyholders' risk attitudes are reflected by the interpretation of interactions among uncertain events. Second, the hesitant fuzzy linguistic preference relation (HFLPR) is employed for individual loss evaluation to reflect a policyholder's hesitation. We apply techniques of fuzzy linguistic term aggregation and comparison to simplify our loss utility function. A detailed process of expected loss assessment is proposed that accounts for variations in natural environment factors, local social characteristics, and disaster-bearing body factors. An illustrative example is given, with a comparison against cumulative prospect theory, to show the merits of the ELU. This study quantifies policyholders' cognition of uncertain events and its influence on risk assessment, which can guide the pricing strategy of property insurance products.
- Published
- 2021
- Full Text
- View/download PDF
92. Do Banks Provision for Bad Loans in Good Times? Empirical Evidence and Policy Implications
- Author
-
Cavallo, Michele, Majnoni, Giovanni, Levich, Richard M., editor, Majnoni, Giovanni, editor, and Reinhart, Carmen M., editor
- Published
- 2002
- Full Text
- View/download PDF
93. Credit Risk and Financial Instability
- Author
-
Herring, Richard J., Levich, Richard M., editor, Majnoni, Giovanni, editor, and Reinhart, Carmen M., editor
- Published
- 2002
- Full Text
- View/download PDF
94. Seasonality in catastrophe bonds and market‐implied catastrophe arrival frequencies
- Author
-
Markus Herrmann and Martin Hibbeln
- Subjects
alternative risk transfer, Economics and Econometrics, bond spreads, seasonality, Yield (finance), Bond, catastrophe arrival frequencies, Magnitude (mathematics), Secondary market, Economics (Wirtschaftswissenschaften), Seasonality, medicine.disease, Catastrophe bond, Accounting, Climatology, ddc:330, medicine, Environmental science, Mercator School of Management - Fakultät für Betriebswirtschaftslehre, underwriting risk, Expected loss, Finance, Active season
We develop a conceptual framework to model the seasonality in the probability of catastrophe bonds being triggered. This seasonality causes strong seasonal fluctuations in spreads. For example, the spread on a hurricane bond is highest at the start of the hurricane season and declines as time goes by without a hurricane. The spread is lowest at the end of the hurricane season assuming the bond was not triggered, and then gradually increases as the next hurricane season approaches. The model also implies that the magnitude of the seasonality effect increases with the expected loss and the approaching maturity of the bond. The model is supported by an empirical analysis that indicates that up to 47% of market fluctuations in the yield spreads on single-peril hurricane bonds can be explained by seasonality. In addition, we provide a method to obtain market-implied distributions of arrival frequencies from secondary market spreads.
- Published
- 2021
- Full Text
- View/download PDF
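A toy version of the mechanism in entry 94: if the annual expected loss is concentrated in June-November, a spread proportional to the expected loss remaining until maturity, annualised over the remaining life, peaks around the season and decays through it given no event. All numbers are illustrative, not market estimates.

```python
import numpy as np

monthly_el = np.zeros(12)
monthly_el[5:11] = [0.02, 0.05, 0.10, 0.12, 0.08, 0.03]  # Jun..Nov EL shares

maturity = 24                    # bond matures in 24 months; t = 0 is January
for month in range(0, 24, 3):
    remaining = np.arange(month, maturity)
    el_left = sum(monthly_el[m % 12] for m in remaining)  # EL still at risk
    spread = el_left / ((maturity - month) / 12.0)        # annualised
    print(f"t = {month:2d} months: spread ~ {spread:.2%}")
```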
95. Accounting for Financial Stability: Bank Disclosure and Loss Recognition in the Financial Crisis
- Author
-
Christian Laux, Christian Leuz, and Jannis Bischof
- Subjects
040101 forestry, Economics and Econometrics, 050208 finance, Financial stability, business.industry, Strategy and Management, 05 social sciences, Accounting, 04 agricultural and veterinary sciences, Incentive, Loan, Fair value, 0502 economics and business, Financial crisis, Capital requirement, 0401 agriculture, forestry, and fisheries, Market expectations, Business, Expected loss, Finance
This paper examines banks’ disclosures and loss recognition in the financial crisis and identifies several core issues for the link between accounting and financial stability. Our analysis suggests that, going into the financial crisis, banks’ disclosures about relevant risk exposures were relatively sparse. Such disclosures came later after major concerns about banks’ exposures had arisen in markets. Similarly, the recognition of loan losses was relatively slow and delayed relative to prevailing market expectations. Among the possible explanations for this evidence, our analysis suggests that banks’ reporting incentives played a key role, which has important implications for bank supervision and the new expected loss model for loan accounting. We also provide evidence that shielding regulatory capital from accounting losses through prudential filters can dampen banks’ incentives for corrective actions. Overall, our analysis reveals several important challenges if accounting and financial reporting are to contribute to financial stability. Keywords: Banks, Financial crisis, Financial stability, Disclosure, Loan loss accounting, Expected credit losses, Incurred loss model, Prudential filter, Fair value accounting
- Published
- 2021
96. Meta-Optimization of Bias-Variance Trade-Off in Stochastic Model Learning
- Author
-
Takumi Aotani, Taisuke Kobayashi, and Kenji Sugimoto
- Subjects
Hyperparameter, Mathematical optimization, Optimization problem, Meta-optimization, Pareto optimization, General Computer Science, bias-variance trade-off, Stochastic process, Computer science, Stochastic modelling, General Engineering, Variance (accounting), Machine learning algorithms, TK1-9971, systems modeling, Reinforcement learning, General Materials Science, Electrical engineering. Electronics. Nuclear engineering, Expected loss
Model-based reinforcement learning is expected to be a method that can safely acquire the optimal policy under real-world conditions by using a stochastic dynamics model for planning. Since the stochastic dynamics model of the real world is generally unknown, a method for learning it from state transition data is necessary. However, model learning suffers from the problem of the bias-variance trade-off. Conventional model learning can be formulated as a minimization problem of expected loss; failure to consider higher-order statistics of the loss would lead to fatal errors in long-term model prediction. Although various methods have been proposed to explicitly handle bias and variance, this paper first formulates a new loss function, especially for sequential training of deep neural networks. To explicitly consider the bias-variance trade-off, a new multi-objective optimization problem with the augmented weighted Tchebycheff scalarization is proposed. In this problem, the bias-variance trade-off can be balanced by adjusting a weight hyperparameter, although its optimal value is task-dependent and unknown. We additionally propose a general-purpose and efficient meta-optimization method for the hyperparameter(s). According to the validation result on each epoch, the proposed meta-optimization can adjust the hyperparameter(s) towards the preferred solution simultaneously with model learning. In our case, the proposed meta-optimization enables the bias-variance trade-off to be balanced so as to maximize long-term prediction ability. The proposed method was applied to two simulation environments with uncertainty, and the numerical results showed that a well-balanced bias and variance of the stochastic model, suitable for long-term prediction, can be achieved.
- Published
- 2021
- Full Text
- View/download PDF
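A generic sketch of the scalarisation named in entry 96, applied to a toy bias-variance curve; the smoothing-parameter family and the ideal point are assumptions for illustration.

```python
import numpy as np

def aug_tchebycheff(objectives, weights, ideal, rho=0.05):
    """Augmented weighted Tchebycheff: max_i w_i (f_i - z_i) + rho * sum_i w_i (f_i - z_i)."""
    d = weights * (np.asarray(objectives) - ideal)
    return float(np.max(d) + rho * np.sum(d))

# Toy family indexed by smoothing h: more smoothing = more bias, less variance.
h = np.linspace(0.01, 1.0, 100)
bias2, variance = h ** 2, 0.02 / h
ideal = np.zeros(2)

for w_bias in (0.2, 0.5, 0.8):
    w = np.array([w_bias, 1.0 - w_bias])
    scores = [aug_tchebycheff((b, v), w, ideal) for b, v in zip(bias2, variance)]
    print(f"w_bias = {w_bias}: best h = {h[int(np.argmin(scores))]:.2f}")
```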
97. Information Segmentation and Investing in Cybersecurity
- Author
-
Lei Zhou, Lawrence A. Gordon, and Martin P. Loeb
- Subjects
Set (abstract data type), Computer science, Perspective (graphical), ComputingMilieux_LEGALASPECTSOFCOMPUTING, Segmentation, Computer security, computer.software_genre, Expected loss, computer - Abstract
This paper provides an analysis of how the benefits of information segmentation can assist an organization to derive the appropriate amount to invest in cybersecurity from a cost-benefit perspective. An analytical model based on the framework of the Gordon-Loeb Model ([1]) is presented that provides a set of sufficient conditions for information segmentation to lower the total investments in cybersecurity and the expected loss from cybersecurity breaches. A numerical example illustrating the insights gained from the model is also presented.
- Published
- 2021
- Full Text
- View/download PDF
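A sketch of the cost-benefit logic in entry 97 using one standard breach-probability class from the Gordon-Loeb literature, S(z, v) = v / (alpha z + 1)^beta; the parameter values are illustrative, not the paper's.

```python
import numpy as np

alpha, beta = 0.001, 1.0
v = 0.5            # baseline vulnerability (breach probability at z = 0)
L = 1_000_000      # expected loss from a breach

z = np.linspace(0, 200_000, 20_001)               # candidate investments
net_cost = v / (alpha * z + 1) ** beta * L + z    # expected loss + spend
z_star = z[int(np.argmin(net_cost))]
print(f"optimal investment z* ~ {z_star:,.0f} "
      f"(1/e rule bound: {v * L / np.e:,.0f})")
```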
98. Integrating expected loss and collapse risk in performance-based seismic design of structures
- Author
-
Davit Shahnazaryan and Gerard J. O’Reilly
- Subjects
021110 strategic, defence & security studies, Computer science, Frame (networking), 0211 other engineering and technologies, Collapse (topology), 02 engineering and technology, Building and Construction, Performance objective, Geotechnical Engineering and Engineering Geology, Reliability engineering, Seismic analysis, Consistency (database systems), Geophysics, Seismic hazard, Expected loss, Civil and Structural Engineering
With the introduction of performance-based earthquake engineering (PBEE), engineers have strived to relate building performance to different seismic hazard levels. Expected annual loss (EAL) and collapse safety described by mean annual frequency of collapse (MAFC) have become employed more frequently, but tend to be limited to seismic assessment rather than design. This article outlines an integrated performance-based seismic design (IPBSD) method that uses EAL and MAFC as design parameters. With these, as opposed to conventional intensity-based strength and/or drift requirements, IPBSD limits expected monetary losses and maintains a sufficient and quantifiable level of collapse safety in buildings. Through simple procedures, it directly identifies feasible structural solutions without the need for detailed calculations and numerical analysis. This article outlines its implementation alongside other contemporary risk-targeted and code-based approaches. Several case study reinforced concrete frame structures are evaluated using these approaches and the results appraised via verification analysis. The agreement and consistency of the design solutions and the intended targets are evaluated to demonstrate the suitability of each method. The proposed framework is viewed as a stepping stone for seismic design with advanced performance objectives in line with modern PBEE requirements.
- Published
- 2021
- Full Text
- View/download PDF
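For reference, the two assessment-side quantities the method in entry 98 targets can be sketched numerically; the hazard curve, fragility, and loss-ratio curve below are generic placeholders, not case-study values.

```python
import numpy as np
from scipy.integrate import trapezoid
from scipy.stats import lognorm

k0, k = 5e-4, 2.5
im = np.linspace(0.05, 3.0, 2000)            # intensity-measure grid
hazard = k0 * im ** (-k)                     # mean annual exceedance rate
dldim = np.abs(np.gradient(hazard, im))      # |d lambda / d IM|

loss_ratio = np.clip(im / 2.0, 0.0, 1.0)     # E[loss | IM], share of cost
p_collapse = lognorm(s=0.4, scale=1.5).cdf(im)   # fragility: median 1.5, beta 0.4

EAL = trapezoid(loss_ratio * dldim, im)      # expected annual loss ratio
MAFC = trapezoid(p_collapse * dldim, im)     # mean annual freq. of collapse
print(f"EAL  ~ {EAL:.4%} of replacement cost / yr")
print(f"MAFC ~ {MAFC:.2e} 1/yr")
```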
99. Economic Assessment of Permafrost Degradation Effects on the Housing Sector in the Russian Arctic
- Author
-
Boris Porfiriev, Dmitry A. Streletskiy, and D. O. Eliseev
- Subjects
Cultural Studies, Natural resource economics, 05 social sciences, Global warming, 050905 science studies, 010403 inorganic & nuclear chemistry, Modernization theory, Permafrost, 01 natural sciences, 0104 chemical sciences, Permafrost degradation, Arctic, Economic assessment, Political Science and International Relations, Environmental science, 0509 other social sciences, Expected loss, Stock (geology)
This article is devoted to the methodology and analysis of the results of economic assessment and forecasting of the consequences of global climate change, in the form of permafrost thawing and degradation, for the housing sector in eight regions of the Russian Arctic. Changes in the state of permafrost soils under the most negative IPCC forecast option (scenario RCP 8.5), taken as the most appropriate to the conditions of the Russian Arctic, serve as the physio-geographic basis for the assessment. It is shown that, under a conservative scenario of housing sector development in this macroregion of Russia in 2020–2050, the annual average cost of maintenance and restoration of the lost housing stock will exceed ₽30 bln. Under the modernization scenario, this cost increases to ₽36 bln. The maximum expected loss is predicted in the Yamalo-Nenets Autonomous Okrug and Krasnoyarsk krai, and the minimum in the Chukotka and Khanty-Mansi autonomous okrugs.
- Published
- 2021
- Full Text
- View/download PDF
100. Equity in Automobile Insurance: Optional No-Fault
- Author
-
Powers, Michael R., Lascher, Edward L., Jr., editor, and Powers, Michael R., editor
- Published
- 2001
- Full Text
- View/download PDF