90 results for "Risk quantification"
This paper introduces an innovative hybrid insurance model designed to cover heavy-tailed losses: extreme, potentially unbounded financial damages, often associated with natural disasters. The model combines traditional indemnity-based insurance for smaller losses with parametric (index-based) insurance for larger, catastrophic events. A key contribution is the development of a specialized optimization criterion and a two-step calibration methodology that can leverage readily available covariate data even when comprehensive loss data is scarce. Empirical analysis using both simulated and real-world tornado data demonstrates that this hybrid contract outperforms traditional capped indemnity contracts by providing better coverage for the same premium, especially benefiting regions with limited data. The authors highlight the practical advantages of faster compensation and reduced operational costs offered by the parametric component.
The paper offers these key takeaways:
• The analysis provides a framework for introducing index insurance in competition with traditional products, emphasizing demand and solvency.
• Key drivers for index insurance demand are policyholder risk aversion, compensation speed advantage over traditional products, and its pricing (loading factor).
• The proposed hybrid product effectively balances the strengths of both insurance types by applying index insurance where it is “most suitable for policyholders,” accelerating compensation, and potentially reducing premiums.
• The methodology can help insurers identify specific loss types for which index compensation is preferred, optimizing portfolio structure and claims management.
• Future work will address modeling demand for index insurance in situations where traditional indemnity-based insurance is unavailable, requiring a “more nuanced approach to calibrate the utility function.”
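The layered structure described above — indemnity for small losses, an index trigger for catastrophic ones — can be sketched as a payout function. This is a minimal illustration of the general contract shape, not the authors' actual specification; the cap, trigger, and payout parameters are hypothetical.

```python
def hybrid_payout(loss: float, index_value: float,
                  indemnity_cap: float, trigger: float,
                  parametric_payout: float) -> float:
    """Illustrative hybrid contract payout.

    Small losses are reimbursed directly (indemnity, capped);
    when the observed index (e.g. wind speed) exceeds its trigger,
    a fixed parametric amount is paid regardless of assessed loss.
    """
    indemnity_part = min(loss, indemnity_cap)
    parametric_part = parametric_payout if index_value >= trigger else 0.0
    return indemnity_part + parametric_part

# A moderate loss with no triggering event pays indemnity only;
# a catastrophic event adds the fast parametric component on top.
print(hybrid_payout(loss=50.0, index_value=1.0,
                    indemnity_cap=100.0, trigger=3.0, parametric_payout=500.0))
print(hybrid_payout(loss=800.0, index_value=4.0,
                    indemnity_cap=100.0, trigger=3.0, parametric_payout=500.0))
```

The parametric part needs no loss assessment, which is exactly the source of the faster compensation and lower operational cost the paper emphasizes.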
EIOPA has issued new guidance on supervising mass-lapse reinsurance and reinsurance termination clauses. This guidance, provided in two annexes to its 2021 Opinion on risk-mitigation techniques, aims to standardize supervisory approaches across Europe.
The first annex focuses on mass-lapse reinsurance, offering detailed guidance for supervisors on its prudential treatment. It emphasizes ensuring a common European approach, particularly in light of recent high lapse risks in various markets. The guidance helps supervisors evaluate how elements like the measurement period, exclusions, or termination clauses affect risk transfer effectiveness and the Solvency Capital Requirement (SCR). A 12-month measurement period is generally expected, aligning with the SCR time horizon.
The second annex addresses termination clauses in reinsurance agreements that could undermine effective risk transfer. It highlights provisions that release the reinsurer from responsibility for legitimate losses during the treaty period and scrutinizes contracts where reinsurers can unconditionally retain transferred premiums and assets upon termination while being freed from obligations. These annexes promote supervisory convergence and fair competition within the market.
This study develops a machine learning framework to identify high-risk enterprise financial reports, comparing Support Vector Machine, Random Forest, and K-Nearest Neighbors models. Using 2020–2025 audit data from the Big Four firms, Random Forest showed the highest performance (F1-score: 0.9012), excelling in detecting fraud and compliance issues. While KNN struggled with high-dimensional data, SVM performed well but was computationally intensive. The study highlights the potential of machine learning in auditing but notes limitations, including reliance on structured data and exclusion of external economic factors.
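The three-model comparison can be reproduced in outline with scikit-learn. The audit data itself is not available, so this sketch uses a synthetic imbalanced dataset as a stand-in; the scores it prints are illustrative and will not match the paper's reported F1 of 0.9012.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.svm import SVC
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import f1_score

# Synthetic stand-in for the (unavailable) 2020-2025 audit dataset:
# many features, a small minority class of "high-risk" reports.
X, y = make_classification(n_samples=2000, n_features=40, n_informative=10,
                           weights=[0.9, 0.1], random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)

models = {
    "RandomForest": RandomForestClassifier(random_state=0),
    "SVM": SVC(),                      # accurate but costly on large data
    "KNN": KNeighborsClassifier(),     # weak in high dimensions
}
scores = {}
for name, model in models.items():
    model.fit(X_tr, y_tr)
    scores[name] = f1_score(y_te, model.predict(X_te))
    print(f"{name}: F1 = {scores[name]:.4f}")
```

F1 on the minority (high-risk) class is the right headline metric here, since plain accuracy is inflated by the 90% majority of unremarkable reports.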
This paper presents a unified framework for reinsurance markets with multiple insurers and reinsurers, using Choquet risk measures and nonlinear pricing. It identifies Subgame Perfect Nash Equilibrium as the optimal concept, proving contracts are rational and Pareto optimal, with insurer welfare gains over monopoly scenarios.
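A Choquet risk measure prices a loss by distorting its survival function before integrating, which is what makes the resulting premiums nonlinear. The sketch below, a generic illustration rather than the paper's specific construction, computes the Choquet integral of a discrete loss under a concave distortion (the square root here is one common textbook choice).

```python
import numpy as np

def choquet_risk(losses, probs, distortion=np.sqrt):
    """Choquet risk measure of a discrete non-negative loss.

    rho(X) = sum_i x_(i) * (g(S_{i-1}) - g(S_i)), with losses sorted
    ascending, S_i = P(X > x_(i)) the survival function, S_0 = 1,
    and g a distortion. A concave g overweights tail probabilities.
    """
    order = np.argsort(losses)
    x, p = np.asarray(losses)[order], np.asarray(probs)[order]
    survival = np.clip(1.0 - np.cumsum(p), 0.0, 1.0)  # S_1, ..., S_n (S_n = 0)
    g = distortion(np.concatenate(([1.0], survival)))
    return float(np.sum(x * (g[:-1] - g[1:])))

losses = np.array([0.0, 10.0, 100.0])
probs = np.array([0.5, 0.45, 0.05])
# Identity distortion recovers the expected loss; sqrt loads the tail.
print(choquet_risk(losses, probs, distortion=lambda s: s))
print(choquet_risk(losses, probs))
```

With the identity distortion the measure collapses to the expected loss (9.5 here); the concave distortion produces a strictly larger value, i.e. a positive risk loading, which is the ingredient the market-equilibrium analysis builds on.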
As extreme weather events intensify, insurers face limits in absorbing losses, necessitating a shift from post-event compensation to loss prevention. This requires interlinked public, public-private, and private solutions, with tough policy decisions on responsibilities and cost allocation. Insurers can leverage risk expertise, data, and technology to promote loss prevention through knowledge-sharing and financing household measures, fostering a cycle of enhanced insurability, reduced protection gaps, and business growth. While insurance law traditionally supports compensation, tailored loss prevention clauses could become standard, addressing protection gaps and creating transformative opportunities. Preventing losses is preferable to settling post-event claims and absorbing uninsured losses.
All strategic and operational decisions should consider risk-adjusted earnings value, as all management inherently involves risk management. Effective risk management requires skilled personnel and a robust system to analyze, monitor, and manage risks, focusing on seven key areas, among them decision-oriented risk management, value-oriented corporate management, risk quantification (including economic, geopolitical, and sustainability risks), and risk aggregation using Monte Carlo simulation. A strong corporate strategy ensures financial sustainability and manageable earnings risks, while embedded risk management enables employees to address risks. These areas, underexplored in the literature, warrant further attention, particularly risk aggregation through simulation methods.
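Risk aggregation by Monte Carlo simulation, the area singled out above, amounts to simulating many business years, summing the individual risk factors per year, and reading risk figures off the aggregate distribution. The factors and their distributions below are hypothetical placeholders chosen only to show the mechanics.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 100_000  # simulated business years

# Hypothetical annual risk factors (units: EUR millions).
market = rng.normal(0.0, 5.0, n)                    # symmetric market swings
credit = rng.binomial(1, 0.02, n) * rng.lognormal(2.0, 0.5, n)  # rare defaults
operational = rng.lognormal(0.5, 1.0, n) * 0.1      # frequent small losses

# Aggregate per simulated year (losses positive, market gains offset).
total_loss = -market + credit + operational
expected = total_loss.mean()
var_95 = np.quantile(total_loss, 0.95)              # 95% value-at-risk

print(f"expected annual loss: {expected:.2f}m, 95% VaR: {var_95:.2f}m")
```

The point of simulating rather than adding standalone risk figures is that the aggregate quantile reflects diversification and dependence effects that simple summation of per-risk VaRs misses.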
The Cyber Due Diligence Object Model (CDDOM) is a structured, extensible framework designed for SMEs to manage cybersecurity due diligence in digital supply chains. Aligned with regulations like NIS2, DORA, CRA, and GDPR, CDDOM enables continuous, automated, and traceable due diligence. It integrates descriptive schemas, role-specific messaging, and decision support to facilitate supplier onboarding, risk reassessment, and regulatory compliance. Validated in real-world scenarios, CDDOM supports automation, transparency, and interoperability, translating compliance and trust signals into machine-readable formats. It fosters resilient, decision-oriented cyber governance, addressing modern cybersecurity challenges outlined in recent research.
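"Machine-readable compliance and trust signals" concretely means structured records a supplier-management pipeline can emit and consume. The sketch below is an invented illustration of that idea, not the actual CDDOM schema; all field names are hypothetical.

```python
import json
from dataclasses import dataclass, asdict, field

@dataclass
class SupplierDueDiligenceRecord:
    """Illustrative due-diligence object for one supplier.

    Hypothetical fields loosely inspired by the CDDOM description;
    the real schema is defined in the cited framework, not here.
    """
    supplier_id: str
    assessed_at: str                                 # ISO 8601 timestamp
    frameworks: list = field(default_factory=list)   # e.g. ["NIS2", "DORA"]
    risk_score: float = 0.0                          # 0 (low) .. 1 (high)
    open_findings: int = 0

record = SupplierDueDiligenceRecord(
    supplier_id="SUP-0042",
    assessed_at="2025-01-15T09:00:00Z",
    frameworks=["NIS2", "GDPR"],
    risk_score=0.35,
    open_findings=2,
)
payload = json.dumps(asdict(record), indent=2)  # machine-readable exchange format
print(payload)
```

Serializing assessments this way is what enables the continuous, automated reassessment loop the summary describes: downstream tooling can diff successive records instead of re-reading PDF questionnaires.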
This study extends the Gordon–Loeb model for cybersecurity investment by incorporating a Hawkes process to model temporally clustered cyberattacks, reflecting real-world attack bursts. Formulated as a stochastic optimal control problem, it maximizes net benefits through adaptive investment policies that respond to attack arrivals. Numerical results show these dynamic strategies outperform static and Poisson-based models, which overlook clustering, especially in high-risk scenarios. The framework aids risk managers in tailoring responsive cybersecurity strategies. Future work includes empirical calibration, risk-averse loss modeling, cyber-insurance integration, and multivariate Hawkes processes for diverse attack types.
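The clustering that distinguishes a Hawkes process from a Poisson model is easy to see in simulation: each attack temporarily raises the arrival intensity, producing bursts. This is a standard Ogata-thinning sketch of a univariate exponential-kernel Hawkes process, with placeholder parameters, not the paper's calibrated model.

```python
import numpy as np

def simulate_hawkes(mu: float, alpha: float, beta: float,
                    horizon: float, seed: int = 0) -> list:
    """Simulate a Hawkes process by Ogata's thinning algorithm.

    Intensity: lambda(t) = mu + alpha * sum_{t_i < t} exp(-beta*(t - t_i)).
    Each event adds an exponentially decaying bump, so attacks cluster.
    Requires alpha/beta < 1 for the process to be stationary.
    """
    rng = np.random.default_rng(seed)
    events, t = [], 0.0
    while t < horizon:
        # Intensity just after t dominates lambda until the next event
        # (it only decays), so it is a valid thinning upper bound.
        lam_bar = mu + alpha * sum(np.exp(-beta * (t - ti)) for ti in events)
        t += rng.exponential(1.0 / lam_bar)
        if t >= horizon:
            break
        lam_t = mu + alpha * sum(np.exp(-beta * (t - ti)) for ti in events)
        if rng.uniform() <= lam_t / lam_bar:  # accept candidate arrival
            events.append(t)
    return events

# Placeholder parameters: baseline rate 0.5, self-excitation ratio 0.8/1.2.
events = simulate_hawkes(mu=0.5, alpha=0.8, beta=1.2, horizon=50.0)
print(f"{len(events)} simulated attacks over 50 time units")
```

In the paper's control setting, an adaptive policy reacts to each simulated arrival (raising investment while intensity is elevated), which is precisely what a constant-rate Poisson model cannot motivate.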
EIOPA's April 2025 Insurance Risk Dashboard indicates stable, medium-level risks in the European insurance sector, though pockets of vulnerability exist due to geopolitical uncertainty and market volatility. Macroeconomic risks are stable but with concerning GDP growth and inflation forecasts. Credit risks remained stable until early April, when spreads widened slightly. Market risks are elevated due to bond and equity volatility. Liquidity, solvency, profitability, financial interlinkages, and insurance risks are stable. Market sentiment is medium risk, and ESG risks are steady but with an intensifying outlook due to shifting environmental agreements.