The International Society of Catastrophe Managers
The ISCM in London gathered on Friday 22 November 2019 in the Lloyd’s Old Library to discuss the challenges of understanding and managing liability cat risk. The first part of the event presented two perspectives on the topic: one from the regulator, presented by Giorgis Hadzilacos of the PRA, and a personal perspective from Matt Harrison of Hiscox. The second part of the event saw a panel of liability modelling experts from across the market discuss the challenges in modelling liability risk, and some of the ways we can work to tackle these today.
Part I - The Regulator’s Perspective
Giorgis Hadzilacos, PRA

Giorgis started the presentation by reminding us that, both in terms of gross written premium and in terms of volume, casualty risk is by no means small. Likewise, there is correlation within casualty classes of business, and with other classes of business, which justifies paying it more attention. Aside from the size of the risk, one of the key challenges for understanding liability risk, and a reason it is a regulatory concern, is that the historical claims typically used for pricing and reserving models are not necessarily good predictors of the future. There is also huge variability in the perception of the risks being underwritten: some casualty writers have dropped books of business due to unprofitability, while other firms have written those same risks at a lower price, with a perceived rate adequacy higher than that of the renewed business. Even for the risks being written today, the views of risk held by underwriting, exposure management and capital teams often differ, leading to inconsistent views of the risk within organisations. Added to these challenges, clash across lines of business remains largely unassessed, and a lack of proactive reserving is another reason we don’t currently have the confidence that we are on top of casualty cat. There is, however, progress towards a more robust liability cat assessment environment, and a range of methodologies to help quantify these risks is being put into practice in the market today. Each methodology has its own merits, and Giorgis emphasised the importance of recognising exposure and applying different thinking from the approaches used in the property cat space. We also need some way of validating outputs from these methodologies, and of comparing against them, to give the industry some comfort. So how do we achieve this?
We need to keep striving for better data, better tools and better methodologies and, most importantly, to share this knowledge. There is a lot of room for non-competitive collaboration: showcasing clash scenarios and emerging risks, and stimulating discussions to help deal with the problem, ideally well before it becomes the ‘next asbestos’. Giorgis hopes that some of the work on the PRA’s agenda in 2020 will provide indirect pressure that can accelerate learning and provide feedback to the community.

Part II - Why is Liability Cat Different? A Personal Perspective
Matt Harrison, Hiscox

Matt started off his presentation by asking the critical question: what is exposure management for liability risk? People often talk about exposure management through the lens of a cat model, but you can’t manage a risk if you don’t know what it is and how it is changing. He made the important point that if we can’t yet understand the risk, why are we even talking about a 1-in-100-year event? We need to focus on what we have and where it is. Whilst the uncertainty in casualty cat is huge, so is the uncertainty in nat cat. Matt asked: if we had a view of when a hurricane would make landfall, weeks or months ahead of time, would we do anything differently to what we do today? Would we stop writing cat risk in Miami if we knew a hurricane was going to make landfall there next year? Whilst there is inherent uncertainty, we have to get better as an industry at thinking about the potential forward risk. Take, for example, the opioids crisis: there were three waves of signals in the U.S. In 1999 there was little noise or signal, and not enough information to justify declining the risk. By 2016 there was a lot more noise, and by 2019 it was already too late, with risks susceptible to claims already on the book. At some point we have to get better at recognising this and start making decisions when we see the signals, rather than when it is too late.
Similarly, we need to think about the peril within the timeframe of the policy. Whilst we have long policy periods, we need to think about mitigating the risk on a shorter scale. The key reason casualty risk is so different to property risk is quite simply that the risks you measure won’t happen, the scenarios you create probably won’t happen, and the historical events you use definitely won’t happen again. One of the difficulties is that, as a community, we are used to using model outputs and understanding exposure in a geospatial way. We need to focus on understanding the policy forms and wordings, and on what the exposure is and where it is, to help understand and manage this complex landscape.

Part III - An Expert Industry Panel
Moderator: Kirsten Mitchell-Wallace, Lloyd’s
Robin Wilkinson, AIR
Adhiraj Maitra, Willis Towers Watson
Matt Harrison, Hiscox
David Loughran, Praedicat

The panel opened with what we can learn from property cat, and what we can carry across to ensure that liability exposure management benefits from progress made elsewhere. The general agreement was on the need for better exposure capture and a better understanding of the nature and size of the business being written. We need data to run any model and, as demonstrated with property cat, the better the data, the better the results. We also need to understand the potential footprint of a casualty cat event and which policies will be drawn into the same event. More importantly, all of this requires internal buy-in from organisations, and we can certainly leverage the battles fought over nat cat models to help here. The panel then discussed what makes liability exposure management so different to nat cat, and what we need to watch out for.
Whilst liability isn’t a new risk to the market, there was a consensus that the complex nature of the risk, influenced by social and technological change, together with the non-repeating nature of events, makes both estimating future events and validating any approach challenging. As the nature of the risk changes, we face the problem of identifying the next emerging liability risk to focus on, which also requires us to understand why some historical events did not materialise. In the liability world everyone is asking what the next asbestos is; conversely, in nat cat, nobody is asking what the next Hurricane Andrew is. The latency of the events themselves also presents a challenge: exposures accrue over decades, litigation unfolds over decades, and the potential correlation between multiple lines of business is enormous. If we want to be effective in helping understand this risk, we need to understand the things that may happen in decades to come. Some of the challenges are also cultural, with a lack of synergy between teams within a company cited as a huge barrier. Likewise, with 30 years of nat cat models in practice, people expect the resolution and accuracy of a nat cat model, which simply isn’t possible here. Given all of the above, what type of exposure management framework can be put in place? As we don’t know for certain what events will unfold, we don’t want to get too prescriptive and fixed to a specific scenario. Instead of prescriptive scenarios, we want thematic ones, representative of our current thinking. In doing so we can tease out the types of risks, and the interrelationships between classes of business. We want to be able to turn around and say we were talking about the right things, even if we didn’t get the specific scenarios and numbers right. If we get too prescriptive, we may end up missing important signals.
The panel then moved to discuss the different approaches used by each of the modelling vendors to tackle the problem. Robin discussed how Arium’s solution tackles the proximity question by using the supply chain: whatever the future event, you can understand the supply chain for a product, and the liability will sit within that supply chain. The scenario-based model helps you analyse and quantify exposure to historical liability events and understand what types of future events could cause loss to a portfolio. Robin emphasised the need for models to be flexible and transparent, so you can stress-test them and adjust parameters as needed. Adhiraj highlighted that a key objective of any model is to understand the underlying risk and exposure. Given that nobody knows what the next asbestos is, understanding your exposure better prepares you for when events happen. Their approach replicates that of property cat, looking at historical events and considering the probability and severity of each. It also considers segmentation of the market to aid identification of exposures, which is an iterative process. David shared how the focus for Praedicat has been on identifying the underlying events within the scientific literature. In this respect the science is the risk, and Praedicat studies the effects that are already in commerce or already causing damage. Praedicat’s approach reads the science at scale and picks out distinct events to track over time, producing a granular model of these events; a probabilistic model is then built on top. This approach focuses deeply on risk identification, and one of the key challenges is the sheer scale of the science and linking it to commercial activity and specific insureds. The panel then moved on to the key question in the title of the seminar: what could a 1-in-100 liability cat look like?
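For readers less familiar with the terminology, a “1-in-100” event is a loss exceeded with 1% probability in any one year. A minimal sketch, using entirely made-up simulated losses rather than any vendor model discussed here, of how that level is read off a modelled annual loss distribution:

```python
import numpy as np

rng = np.random.default_rng(seed=1)

# Hypothetical simulated annual portfolio losses (in $m) -- purely
# illustrative numbers, not calibrated to any real book of business.
annual_losses = rng.lognormal(mean=3.0, sigma=1.0, size=100_000)

# A "1-in-100" loss is the level exceeded with 1% annual probability,
# i.e. the 99th percentile of the annual loss distribution.
loss_1_in_100 = np.quantile(annual_losses, 0.99)

# Check: the empirical annual exceedance probability at that level.
aep = (annual_losses > loss_1_in_100).mean()
print(f"1-in-100 loss: {loss_1_in_100:.1f}m, exceedance prob ~ {aep:.3f}")
```

As the panel notes below, the mechanics are simple; the hard question for liability is whether any such distribution can be estimated credibly in the first place.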
There were mixed opinions about the use of the term “1 in 100” for liability cats, and commentary that this terminology could be more harmful than helpful. Most agreed it aids communication by providing a reference point and a lexicon that everyone in nat cat can use; the risk, however, is overconfidence in the numbers we have by assigning specific return periods, and a suggestion of inevitability of these events. In liability there is no inevitability, as it is a human-driven process. If we are to assign probabilities to events, they should be seen as a guide rather than a given, and as an aid to communication above anything else. Considering all of the above, what can exposure management professionals do to help today? One focus should be on monitoring the risks and identifying where the vulnerabilities are, rather than waiting until the end to take action. Given the long timeframes for events to unfold, educating clients on understanding the risk, and how they can manage it, can only be helpful. Similarly, it comes down to how you write the risk. If it is perceived to be a big risk, it should be the main peril on the policy. The key isn’t to exclude the risk but to be affirmative about it, and to work collaboratively with underwriters to deepen their understanding of it. To conclude the seminar, the group gave their views on what additional skills and knowledge individuals need to develop in order to tackle the challenges discussed. There was agreement that modelling these types of events requires a wide range of expertise: understanding coverages and legislation, but, more importantly, how things are used in products. As well as experts, we need to bridge the gap between exposure management and underwriting through better communication, as we have seen with property and nat cat.
There was also a suggestion that we need to move away from writing all-perils policies with a list of exclusions for hazards we don’t understand, and towards writing more granular policies with an understanding of the risk itself. Different models give different outputs for different purposes, and it is important not to be wedded to any one approach. Similarly, whilst a model may help, it does not take away the responsibility of understanding the underlying risk and the policy wording. The good news is that a lot of the required expertise exists within insurance companies today; it is really about bringing it together. The ISCM will continue to support cross-industry initiatives that support knowledge dissemination on this topic. Please get in contact with Alan Godfrey if you would like to contribute further in this space.
The London arm of the ISCM gathered on Monday, 23 September 2019 in the Lloyd’s Old Library to discuss the challenges and the opportunities presented to our community by climate change.
The afternoon was organised in two parts: first we heard the Chief Risk Officer’s perspective in a presentation by Vinay Mistry of Channel Managing Agency, followed by a panel of distinguished market scientists and experts moderated by Richard Dixon.

Part I - The Chief Risk Officer’s Perspective
Vinay Mistry (CRO, Channel Managing Agency)

The concurrence of the ISCM event with the UN forum on climate change (the one famously attended by Greta Thunberg) wasn’t lost on attendees and speakers, and Vinay made an ice-breaker of it by mentioning that other climate change events were available. That opening line set the tone for an informative and thought-provoking presentation, delivered informally but sharply. Vinay’s main argument was that climate change presents us with a set of risks which we are called to manage, but also a set of opportunities which, once identified, can be transformational for our industry and for us as practitioners. Vinay explained that the direct impact on top and bottom line is not the most immediate concern; in his view, transition risk has the potential to be most impactful, especially when it comes to step changes. The challenge here is to correctly identify not the magnitude but the direction of change, as in the short term global warming can produce extreme and unexpected localised effects. Climate change is, however, already opening new opportunities for our industry: the emergence of flood as a major catastrophic peril is the one we are most familiar with, but looking ahead, marine vessels might soon be able to navigate the Arctic route, and the energy business is already moving into renewables. We will also need to deal with a new set of regulations, possibly leading to new ways of managing capital. Vinay concluded his speech on a forward-looking and positive note.
He identified three key ingredients for managing the risks and taking advantage of the opportunities: communication, improved tools, and the ability to reframe our mission to include social and environmental conscience alongside profit. The Q&A session highlighted the need for industry-wide collaboration, and Dickie Whitaker from OASIS gave a few examples of the type of forums already active that might facilitate it.

Part II - An Expert Industry Panel
Moderator: Richard Dixon (CatInsight)
Tom Philp (Manager, Science in the Science and Natural Perils function at AXA XL)
Jessica Turner (Senior Vice President, Catastrophe Advisory at Guy Carpenter)
Paul Wilson (Head of Non-Life Analytics at Securis)
Ioana Dima-West (Executive Director, Head of Model Research and Evaluation at Willis Re)

The panel discussion was structured around five key questions, each giving the panellists the opportunity to expand on their point of view.

Q1 - What are the impacts of climate change on catastrophe risk, and does the science help us?

The consensus among the panellists was that climate change can particularly affect so-called “secondary perils”. For instance, storm surge and rainfall flooding seem to be becoming more prominent in tropical cyclones. The challenge, however, is that the science is not mature enough: in some cases we can be fairly confident of the direction of change, but we have little evidence with which to quantify it. Further considerations were made about the adequacy of current catastrophe models with regard to their use for modelling multi-year deals on an aggregate basis. But even over the typical one-year time horizon, it is difficult to say whether our models correctly reflect the risk. We also discussed the need for cat model vendors to quantify and communicate how much climate change is already included in their models. Vendors need to justify why climate change is or isn’t explicitly accounted for – e.g.
a good reason why it isn’t would be that the direction of change in a specific peril is uncertain but is considered by the scientific community to be within historical inter-annual variability. Statements like this are already made in some white papers. The conclusion was that science helps us when it is convergent: if multiple sources point in the same direction, we can be more confident of the signal, and it is more appropriate to implement new adjustments or new features, but care is needed. Interpreting this is a role for applied scientists within the industry.

Q2 - Is it fair to say our response is driven by regulation?

The consensus among panellists was that regulation certainly plays a role; in countries where it is absent, conversations about climate change are more difficult. However, at least two other important factors have influenced our agenda on climate change: the role of investors and the role of public entities. Within the ILS market, for instance, end investors are increasingly questioning the impact of climate change on catastrophe risk, and their continued confidence in the asset class requires a robust response on how climate change is considered and included in our modelling. Beyond the insurance market, government and public entities have been working with cat model vendors for many years, using and adjusting their models to answer questions on the financial impacts of climate change relevant to their long-term resilience needs. It is arguable, therefore, that the work of the PRA came after enough momentum had already built in the industry.

Q3 - What can we realistically achieve when the vast majority of our work is based on a time horizon of one year?

The panel acknowledged that the question addressed an important aspect, as climate change effects will inevitably be negligible over the course of one year. Two important considerations, however, followed.
Firstly, the need was identified for the cumulative effects of climate change to be incorporated into our current models. When pricing and transacting business we might still want to quantify the impact of sea level rise on the potential for storm surge, and the leading vendor models already do this. Secondly, longer time horizons are indeed present in our everyday work, even if they occupy less of our time. In the mid-term horizon, for instance, we want to consider possible changes in asset values and the ever-present risk of stranded assets, factors that influence exposures and vulnerability. In the long term, environmental, social and corporate governance takes centre stage. The conclusion was a call to be proactive: we, as practitioners, should be building a dialogue with the board, in order to be the ones framing the long-term narrative.

Q4 - What would your ideal research dataset be?

After agreeing that data is plentiful and easy to reach, the panel went on to disagree on how to use it and for what. Several suggestions were made, each tailored to a slightly different use case and the goal of each panellist. The most significant can be summarised as follows:
- A set of projections with a slightly shorter horizon, say 2030 instead of 2050;
- A better understanding of changes in probability for extreme events, in the short as well as the long term;
- Greater scrutiny of current models, and better documentation, so that academic studies can be used more effectively;
- Communication to academia (or even to insurers) by vendor modellers of the “baseline” historical period from which they derive their view of risk;
- Work with academia to translate scientific output into cat-modelling-ingestible “variables”: e.g. instead of “changes in precipitation”, provide information on “changes in flooding events”.

Q5 - Are we biased towards thinking things are getting worse?
The panel agreed there are areas where things will get better, and these need to be recognised. The challenge is always around communication, as many of these factors will sound counter-intuitive to our audience. A suggestion was made that the uncertainty around climate change is a much bigger issue than we are used to. However, our industry is fundamentally built on uncertainty, so for some stakeholders in the market climate change may present commercial opportunities, e.g. insuring renewables or developing new products. Concluding the afternoon was the remark that our community has been thinking about climate change for decades now. Our experience makes us best placed to tackle the issue, and events like this one will ensure we are also best equipped.

On September 19, ISCM members gathered in New York City for the organization’s annual educational event. The morning sessions revolved around the overarching theme of model completeness and kicked off with presentations on energy risks and claims handling. Attendees learned about modeling energy infrastructure, followed by assignment of benefits in the Florida market. Next up was a rating agency perspective, which highlighted the different viewpoints each rating agency has on catastrophe models. The morning was rounded out with an intriguing presentation on artificial intelligence and its problem domain. Over lunch, members were given an update on the ISCM/iCAS Credential program.
The afternoon was packed with a climate-focused agenda, beginning with a state-of-the-science session followed by an informative session on the Actuaries Climate Index. Two model vendors shared insight on the opportunities, challenges, and confidence levels for different perils that come with modeling climate change, and the sessions wrapped up after hearing from three different organizations in the climate consulting space, each bringing unique services to the market. Less than two weeks later, an additional seminar was held in Chicago with a similar agenda to New York’s event, albeit with different speakers. Model completeness discussions at this event revolved around data quality, with presentations on claims handling and rating agencies’ viewpoints similar to those in New York. In the afternoon, the climate change presentations touched on the volatility and uncertainty that exist by peril. Chicago attendees also learned about the Actuaries Climate Index, followed by two additional model vendors briefing the audience on climate change from a flood perspective and a general modeling perspective. The afternoon sessions wrapped up with a fascinating presentation relating climate to supply chains. Both events closed out with a meet-and-greet session where presenters and attendees had the opportunity to network over food and drinks. Overall, more than 95 individuals from 36 organizations participated across both events. Some feedback received from attendees at both sessions is as follows: “The topics were thought provoking, engaging, and certainly very related to cat modeling.” “I found [the ISCM event] helpful for understanding the needs of the field.” “The quality of the … presentations was very good this year… The information was well worth being out of the office for a day.” “I thought that the day was filled with great presentations and a lot of useful information.
The networking event after was also a perfect opportunity to meet and discuss with fellow individuals in the catastrophe industry.” The ISCM is looking forward to another great turnout next year!

This year’s September ISCM conference in Zurich was entitled “Climate change – a key driver in catastrophe risk management, or one factor amongst many?”. The topic of climate change is hardly new to the insurance industry and has ranked as a risk concern for years. But in spring 2019, when we settled on the conference theme, no one would have anticipated the tremendous interest and broad media coverage this topic would receive by autumn. Just the day before our conference, for instance, climate activist Greta Thunberg addressed the assembled world leaders at the United Nations Climate Action Summit in New York.
It thus came as no surprise that our conference in Zurich met with overwhelming interest from ISCM members. In fact, for the first time ever we could not admit a few latecomers, as we had reached the full capacity of the conference room. This year’s hosting company was Axis Capital, which did a tremendous job welcoming and accommodating the more than 80 participants, as well as the speakers, on its Zurich premises. The afternoon kicked off with a welcome address by Steve Arora, CEO of Axis Reinsurance, followed by a conference overview from Peter Zimmerli, Cat Risk Manager at Axis Capital and ISCM board member. The programme put together by the organization committee in Zurich aimed to provide a view of the current and potential future impacts of climate change from various angles. Professor Stefan Broennimann, Head of Climatology at the University of Berne and co-author of the 5th Assessment Report of the Intergovernmental Panel on Climate Change (IPCC) in 2014, summarized the current scientific consensus view. He was followed by Jane Toothill, Director of JBA Risk Management, who showed an approach to quantifying climate impact using flood models and shared insights from the climate-related regulatory framework in the UK. After that, Annemarie Buettner and Mathias Graf from Zurich Insurance showed a web portal they developed, allowing corporate insurance clients to assess potential climate change impacts for their specific exposure footprint. After a coffee break, Michael Rueegger, Deputy Chief Underwriting Officer for Agriculture at SCOR, provided interesting insights into the challenges faced in this particular line of business, and by underwriters more generally. Then Thierry Corti, Head of Sustainability Management at Swiss Re, gave an example of how a company looks at climate change from a broader, overall risk management perspective.
As the last speaker, Stefan Gross, Specialist on Sustainable Finance at the Swiss regulator FINMA, shared his view on current considerations and possible developments on the regulatory side. After the individual presentations, Amaryllis Mouyiannou, Senior Cat Modelling Analyst at Axis Capital, invited the speakers to a panel session with strong audience interaction. After a few closing remarks, many participants stayed for the snacks-and-drinks reception in the entry hall of the Axis building. It was a great way to catch up with colleagues and to continue discussions around how climate change may impact cat managers’ daily tasks and our industry more broadly. A great “thank you” goes to our speakers for their commitment to supporting our event. Equally, we thank this year’s host Axis Capital for offering the use of their premises, impeccably mastering the event logistics and providing catering. And last but certainly not least: the event would not have been possible without a truly fantastic ISCM Zurich organisation team.

The International Society of Catastrophe Managers (ISCM) was established in March 2006. One of its aims is to promote and develop catastrophe and exposure management within the insurance industry. The London arm of the ISCM gathered on Friday, 28 June 2019 in the Lloyd’s Old Library to discuss the challenges presented to our community with regards to managing and modelling cyber underwriting risk.
The afternoon was organised in two parts: first we heard an underwriter’s perspective on the challenges of underwriting cyber from Helen Gemmell of Brit Insurance, followed by a panel of distinguished market experts moderated by David Clouston of Pelagius.

Part I - The Underwriter’s Perspective
Helen Gemmell (Senior Underwriter, Cyber, Brit Insurance)

Having previously worked as a catastrophe exposure manager, Helen was well placed to address the audience, and she offered comments on cyber coverages and market trends as well as her views on cyber models. The learning points from this session can be grouped into three categories: market trends, data issues and exposure management challenges.

Market trends

Coverage is offered for both first-party and third-party losses in the event of malicious and non-malicious attacks; the latter are accidental losses or system errors, which can be costly and difficult to identify. Typically, insurance would pay for all the legal and regulatory requirements (including fines), the extortion costs and any business interruption. Should the need arise, it would also pay for IT forensics and legal defences. The ever-evolving nature of the risk and the current competitive market, however, are leading to broader coverage. For example, damage to hardware, including to single devices (a practice called ‘bricking’), which used to be excluded, has become commonly offered since NotPetya. Another example is the current pressure on the standard infrastructure exclusion, which applies to all losses arising from electrical failure or water or gas leakage. Removing, or even just narrowing, this type of exclusion poses significant challenges for the industry, including pricing and aggregations across lines of business. Further undermining the industry’s ability to build robust models is the inconsistency of contract wordings.
Multiple syndicates and broking firms use their own wordings, each with different endorsements, which makes it very difficult to capture the many variations in the terms and conditions offered.

Data issues

Data is largely incomplete. Some proposal forms are missing basic information like data volumes, patching policies or disaster recovery plans. And even when we do have data, we need to question it. A client might have a business continuity plan, but the plan might itself rely on data stored on a specific device (like a laptop) which may not be accessible during a cyber event. Quite often these flaws, which seem obvious in hindsight, are not picked up until a breach actually happens (e.g. zero-day vulnerabilities), making it easier, ironically, to rate a company with a history of cyber incidents than one without. As industry practitioners, we need to define the minimum amount of information required to model cyber. This threshold might vary with the size of the company, and data requirements should also vary depending on the aspect of the risk being modelled. Lately, ransomware attacks have been the largest driver of claims, whereas previously it was data breaches. Ransomware attacks exploit software vulnerabilities, so understanding software aggregations becomes particularly important: how much a company relies on Microsoft as opposed to Linux or others, for example, is critical data that is not commonly available.

Exposure management challenges

In summary: the risk is evolving, coverage is complex, wordings are inconsistent, and clients are unpredictable. Against this backdrop, building a robust modelling framework looks unrealistic, but models provide guidance for underwriting and can be effective for portfolio accumulation management. We need to do more to understand and communicate uncertainty, and underwriters have a responsibility to educate exposure managers on wordings, coverages and the like.
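The software-aggregation point can be made concrete with a toy sketch. The insureds and platform names below are entirely hypothetical; the point is only to show how, given dependency data, a simple count reveals where a single software vulnerability could accumulate across a book:

```python
from collections import Counter

# Illustrative only: hypothetical insureds and the software stacks
# they rely on (as noted above, this data is rarely available).
portfolio = {
    "InsuredA": {"Windows", "Office365", "AWS"},
    "InsuredB": {"Linux", "AWS"},
    "InsuredC": {"Windows", "Azure"},
    "InsuredD": {"Windows", "AWS"},
}

# Count how many insureds depend on each platform -- a crude measure
# of the accumulation a single vulnerability could trigger.
aggregation = Counter(sw for stack in portfolio.values() for sw in stack)
for software, count in aggregation.most_common():
    print(f"{software}: {count} of {len(portfolio)} insureds")
```

In practice the same counting idea would be weighted by limits and attachment points rather than simple policy counts.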
In many cases, deterministic modelling does not need to be very technical, as a simple scenario can give rise to an appropriate stress test. When looking at tail risk, that is, low-frequency, high-severity events, probabilistic models can be complemented by scenarios and every other tool available. Clear risk management processes, regularly reported on, are essential for assessing the risk as an underwriter.

Conclusion

The market is still growing significantly, both in terms of new clients seeking new policies and in terms of existing clients buying significantly more limit. However, there is a time lag: given the complexities of the risk, it might take a year to complete the proposal form and another year to underwrite; adding negotiations and sign-off, it can take up to three years to set up a new policy. Rates are still largely driven by market forces, but the race to the bottom has recently shown signs of stabilising, and loss ratios are still attractive despite growing steadily. Despite all that, some insurance companies are bound to be caught unprepared by a systemic event, such as a cloud service provider failing for a sustained period of time. These events hold the potential for significant market aggregation.

Part II - An Expert Industry Panel
Moderator: David Clouston (Pelagius)
Mark Christensen (Head of Catastrophe Management @ Chubb Overseas General)
David Singh (Head of Exposure and Portfolio Management @ MS Amlin)
Domenico del Re (Director @ PWC)
Justyna Pikinska (Cyber Pricing Actuary @ Capsicum Re)
Jamie Pocock (Cyber Analytics @ Guy Carpenter)

The panel discussion opened with the question of human agency, and whether this specific feature of the risk allows the development of catastrophe models in the same way that natural perils do. There was some consensus on the need to learn to walk before we can run, in that cyber risk is evolving very quickly and is still emerging (we have not seen the worst of it – will we ever?).
Against this backdrop both actuarial techniques (which learn from the past) and catastrophe modelling techniques (which rely on accepted studies of the risk) have material limitations. There are a few things we can learn from past experience in cyber: recent events result from human actions or operational technology failure, regulation is tightening, and fines are becoming more severe. At the same time, we need to be conscious of the ever-evolving nature of cyber risk: change affects behaviours and the nature of claims. Although a few years ago the main cause of claims was loss of data, more recently it has become ransomware. Following the attack on the city of Baltimore, where the ransom was not paid, the governments of other cities under attack chose to pay, creating the conditions for hackers to follow with more attacks. This has clearly increased the frequency of events, which in turn can lead to a spike in annual severity. But dealing with human agency is not new for our community; we see this when modelling flood, for instance, and dealing with an ever-evolving peril is also not uncharted territory. For example, our hurricane models are getting better at capturing the storm surge element, which in turn is becoming more severe because of climate change. Exposure managers need to think about the assets first, whether tangible or intangible, and develop ways to assess possible damage; in other words, going back to basics before trying to push the envelope with probabilistic models. And that is because the potential for very severe losses is real, even more so considering insurance penetration is growing. In 2018 the global economic losses from cyber were three times the size of natural catastrophe losses. On the topic of human agency and anthropogenic risk we heard from Stephen Burr of Pool Re, in the audience. He described the process of issuing the first ILS product backed by an in-house developed model.
The main feature of the Pool Re model is that it is credible, and that reliability has allowed them to have a conversation with ILS partners. The frequency component was the most challenging, requiring a partnership with academic institutions. From terrorism the conversation veered towards state-sponsored attacks. There was a general question about whether the models differentiate between state-sponsored and "lone wolf" attacks; more generally, the panel agreed there is uncertainty around attribution, and considerations would have to be made around targeted attacks versus collateral damage. What about non-malicious losses? A coding error or other omission can cause serious damage, and policies should cover these events as well. These situations are not financially motivated and might as such be more difficult to capture within a catastrophe modelling framework. But even admitting that we are not able to capture all the nuances of the peril, the panel agreed that models have been and remain a very useful "measuring stick" to help understand the relative risk. The question of data came up next, with the observation that there is no consistency in the underwriting community about rating factors. This poses a challenge in that there is no consensus about which aspects of the peril are more or less risky. It is clear that we need data standards, and maybe some level of education, to develop standard ILF curves or other rating tools. The panel went as far as suggesting there could be a market-wide initiative, an open modelling framework to agree on a common language around cyber. There is, however, an immediate need to meet regulatory requirements and quantify cyber risk accurately for solvency calculation purposes, and at the moment each regulated entity is defining its own way of doing this. There are several approaches in use, from deterministic scenarios to frequency / severity modelling.
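The frequency / severity approach mentioned above can be sketched as a short Monte Carlo simulation. This is a minimal illustration, not any firm's actual method: it assumes a Poisson event frequency and lognormal severities, with parameters chosen purely for demonstration.

```python
import numpy as np

rng = np.random.default_rng(42)

# Illustrative parameters only
annual_freq = 1.5               # expected number of cyber events per year
sev_mu, sev_sigma = 14.0, 1.8   # lognormal severity parameters (log scale)
n_years = 100_000               # number of simulated years

# For each simulated year, draw a Poisson event count, then a lognormal
# severity per event, and sum to get the annual aggregate loss.
counts = rng.poisson(annual_freq, n_years)
agg = np.array([rng.lognormal(sev_mu, sev_sigma, n).sum() for n in counts])

# A tail metric of the kind used for solvency: the 1-in-200 (99.5th
# percentile) annual aggregate loss.
var_995 = np.quantile(agg, 0.995)
print(f"Mean annual loss:    {agg.mean():,.0f}")
print(f"1-in-200 annual loss: {var_995:,.0f}")
```

The attraction of this structure is that frequency and severity assumptions can be debated, stressed and replaced independently, which is exactly where the panel saw the need for market-wide data standards.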
Such variety is evidence that the market is making an effort to understand and quantify the risk, and the panel expressed optimism that these efforts are converging towards a common understanding. The discussion concluded with an analogy with the evolution of catastrophe models. Considering that catastrophe modelling as a discipline was born in the mid-1980s, each panellist was asked to say where cyber models are today. The consensus was that we are around the mid-1990s, but one panellist also suggested that the evolution of cyber models is going to be a lot quicker than that of natural hazards models, with one year in the cyber space equivalent to maybe seven years in the natural hazards space. At this pace, it will be only a few years before cyber catastrophe models could be as developed as current natural hazards ones – and rightly so.

With the end-of-year renewals behind us, January seemed the perfect time for a casual networking get-together of the Zurich ISCM crowd. This time we met in a location along the Limmat river. But it was much too cold to indulge in pretty views outside, so we stayed firmly inside and huddled together around the bar. A few (unanticipated) company events took place the same evening, so the turn-out was somewhat lower than in prior years. Nevertheless, as ISCM members greeted old colleagues and made new acquaintances, the place soon teemed with lively discussions. The big topics this year…? Two ISCM members came straight from their very last working day, after their company had restructured (away) the cat unit. So clearly, the question of what all the recent M&A activity and continuous financial industry cost-cutting would mean for the cat risk management community in Zurich was high on the discussion agenda. Of course, renewals observations and cat market developments in general were on people's minds as well.
In summary, the annual networking event once again proved a good opportunity to bring the ISCM community in Zurich together and demonstrate the value of our membership. A small crowd of devoted members stayed all the way to the end, when we got into a disappointing argument with the bar owner over the terms of the agreement we had made, and thus over the final bill. Well, no matter who was right or wrong, we all swore that we'd never set foot in that bar again!
Special thanks for organizing this event go to Amaryllis and Lysiane, as well as to Annemarie (by now our unofficial event photographer…).

Advancing our understanding of the impacts of hurricane clustering in the North Atlantic | 12/31/2018

The ISCM, along with the Institute and Faculty of Actuaries (IFoA) and Lloyd's of London, recently hosted a panel discussion on the impact of hurricane clustering on the (re)insurance industry. A packed Lloyd's Library listened intently as industry representatives, academics, researchers and catastrophe modeling company experts brought the audience up to date on the latest thinking on this important and topical subject. The panellists – Dimitris Papachristou from the Prudential Regulation Authority (PRA) of the Bank of England, Junaid Seria (SCOR), Nick Barter (Chaucer), Susanne Kolwinksi-Ward (AIR), Ivan Kuhnel (CoreLogic) and Steve Jewson (RMS) – with frequent interaction from an engaged audience, discussed and debated for two hours whether anything has changed in our understanding of the impacts of clustering, in light of the events of 2017 and 2018.
The topic for the session, “Hurricane Clustering in the North Atlantic: A Discussion”, was introduced and expertly moderated by Emma Watkins, from Lloyd’s. The panellists were given a number of questions to consider.
Each of the panellists approached the questions from a different angle, as one would expect given their different roles in the catastrophe management world: some focussed on hazard, some on loss impacts, and others on whether the hazard presented by clustering is changing in a significant way. This enhanced the quality of the presentations, allowing the presenters to go “deep” on their questions and encouraging further discussion by panellists and audience alike, adding to the overall experience! The research conducted by the PRA indicates that the frequency of hurricanes in the past two years does represent a significant shift from the clustering we would expect to see in the basin over time. AIR also indicated that the change in hazard was indeed statistically significant, based on their research, on a U.S. country-wide basis, for major (category 3-5 on the Saffir-Simpson scale) hurricanes. RMS stated that, considering wind only, also on a U.S.-wide basis, the impact of clustering on loss was not material, and did not need to be considered for reinstatements. There was a lot of discussion of the application of clustering in an insurance industry context. All the modelers implement clustering, allowing it to flow through to their financial models: AIR and CoreLogic use the negative binomial distribution to represent clustering in their models, while RMS uses the Poisson distribution for the non-clustered view and a combination of negative binomials for the clustered view. Data availability and reliability was a common theme throughout the session; all agreed we simply need more data, with the resulting uncertainty not having changed much since clustering was first seriously considered in 2004 and 2005. The industry representatives, from SCOR and Chaucer, both stated that collaboration between researchers, modelers and business was critical to progressing our understanding.
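The distinction between a Poisson (unclustered) and negative binomial (clustered) view of annual event counts can be illustrated with a short simulation. The parameters below are purely illustrative, not the vendors' actual calibrations; the point is that with the same mean, the negative binomial's extra variance produces more multi-hurricane years, which is what matters for reinstatements:

```python
import numpy as np

rng = np.random.default_rng(0)
n_years = 200_000
mean_count = 1.7   # illustrative mean annual hurricane count

# Unclustered view: Poisson, where the variance equals the mean.
poisson_counts = rng.poisson(mean_count, n_years)

# Clustered view: negative binomial with the same mean but variance
# mean * (1 + mean / r), i.e. overdispersed. Parameterised via (r, p)
# so that the mean r * (1 - p) / p equals mean_count.
r = 3.0
p = r / (r + mean_count)
negbin_counts = rng.negative_binomial(r, p, n_years)

for name, counts in [("Poisson", poisson_counts), ("Neg. binomial", negbin_counts)]:
    print(f"{name}: mean={counts.mean():.2f}, var={counts.var():.2f}, "
          f"P(3+ events)={np.mean(counts >= 3):.3f}")
```

Running this shows both distributions matching on the mean while the negative binomial puts visibly more probability on years with three or more events, which is the clustering signal that flows through to reinsurance loss and reinstatement calculations.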
Further research is also needed to understand whether any changes in clustering are “real” and would have a material impact on losses in the future. The session also featured an update on the partnership between the ISCM and iCAS (Casualty Actuarial Society) to develop a joint program of credentialization for people working in the catastrophe management industry. As we wrapped the discussion up in the Library, a group of attendees adjourned to continue the discussion at One Under Lime, outside on a chilly and blustery night in London. All agreed that it was extremely valuable to be brought up to date with the latest perspectives on clustering, and that we had not heard the last of the clustering debate! Academia should learn more about what the needs of the insurance/re-insurance industry are so that we can do a better job serving you. I look forward to other opportunities to learn more and maybe also show some of what we have been working on that could be of interest. If offered again next year, I will definitely plan to attend it again - Paolo Gardoni, Professor - University of Illinois at Urbana-Champaign Thanks again for organizing the conference last month. Overall, I thought it was very good. Dr. Gensini’s presentation was the highlight, and I really liked the approach of the morning presentations breaking the models into the parts of hazard vs. vulnerability vs. financial modeling. I thought all of those were good presentations as well. The presentation on the ISCM/iCAS credential generated a fair amount of discussion between me and my two staff members that attended the conference, so that update was useful. Well done on the topics that were chosen and I hope you consider organizing another Chicago event next year! - Derek Berget, ERM Modeling Director - American Family Insurance I am glad to hear the ISCM is considering Chicago again. I thought all sessions were good, especially the keynote by Dr. Gensini. 
I also enjoyed having the round table discussion and wish it could have been longer – There really are some big topics we need to discuss about the changes of the Insurance Industry and in CAT modeling. I would like to see a more detailed presentation on what others do in pre-event and post-event projections and reserving. I am looking for new ways we can increase our effectiveness with the tools at hand. - Zach Antle, Senior Reinsurance Catastrophe Risk Analyst - EMC Insurance Companies I wanted to tell you all how impressed I was with the NYC educational seminar, the venue, and the people. Over 40 years of attending conferences to date the RAA cat school is in my mind the best and as such the benchmark to which all others should be compared. My favorite RAA seminars were the very early ones because the size was more intimate. The smaller numbers gave everyone a chance to interact with each other as well as the presenters. The early events also lacked the pressures of client meetings, other distractions, and typically people learned more. This event brought back memories of what the RAA Conference first set out to be. Even as an old man I found each session to be more open, educational, interactive than any other seminar that I can recall. This is not to distract from the RAA cat school which is on another level with a different and more robust agenda, it is just to say how impressive this event was. Personally I like the intimate settings better than the gigantic settings since it feels more comfortable to engage. I would encourage the ISCM to do more of these and see your niche as one that no one else occupies! You really taught others! - Andy Castaldi, ISCM Past President I very much enjoyed the NY Education Event last month. Thank you to the ISCM for the work you do for seminars like this. With me being new in the field, it is great to hear from people with such expertise and experience. I can look to them and see all possibilities in Cat Modeling. 
Yet it also is great that I get to meet people around my age, in similar points of their careers, and share experiences. I thought the seminar covered a variety of topics. My favorite part was the roundtable session. It was informative to have an open session with a back-and-forth style conversation. I think it would be beneficial to have more sessions like that. Specifically, an information-type session, where people in the earlier part of their career are able to talk with people who have been in the field for years. We would be able to ask questions related to career advice, how to become more involved and the future of the field. - Megan Royek, Reinsurance Analyst - The Toa Reinsurance Company of America The quality of the speakers and their presentations were very good in the sessions I attended. The presenters effectively broke down complex topics into ways a newbie could easily digest. I liked that I saw students in attendance. When I attended my first seminar in 2016, I believe I was the only student. Everyone that I met there was already employed. This is a great networking opportunity and I was happy to see more students there. Attending these seminars is beneficial. Overall, I think the experience was totally worthwhile. I look forward to attending future events and learning more about CAT Modeling and Re/Insurance. I’m especially interested in the ISCM / iCAS designation that will be coming in the near future! I’ll be taking advantage of that opportunity ASAP. - Albert Betancourt - American Family Insurance This year's ISCM conference was all about tech innovation and how it can be applied to catastrophe management. Hosted by SwissRe in the brand new SwissReNext building, the conference had the futuristic setting that fit the occasion.
Peter Zimmerli, the head behind ISCM Switzerland, gave a warm welcome speech to the roughly 60 catastrophe specialists from various companies. The first topic of the afternoon was “Blockchain” and its (potential) applications in catastrophe insurance. Tobias Noack (Etherisc) presented examples where the technology is already used, e.g. in a flight delay insurance product. Scott Beckermayer (Allianz) explained the ecosystem that a blockchain can build and how this could make captive insurance administration more efficient. Michael Stahel (LGT Capital Partners) introduced his ideas on how the whole insurance value chain could be simplified from the client to the reinsurer. The second part was dedicated to “Data Analytics and Machine Learning”. Jason Futers (Insuredata) kicked off the topic by introducing his company’s exposure enhancement capabilities. David Fox (Geospatial Insights) presented possibilities around post-disaster event response using drones or satellites. Grazia Frontoso (Google) showed interesting research around machine learning on damaged car images. As the last presenter of the afternoon, Loris Foresti from MeteoSwiss demonstrated machine learning approaches used, for example, in precipitation forecasting. In panel discussions following each of these blocks the Zurich ISCM members actively participated with their questions, doubts and ideas around the topics presented. Zurich showed itself from its best side when the last item on the agenda was ticked off: a marvellous sunset provided the backdrop as the 2018 ISCM conference came to an end with the sound of clinking glasses on the roof terrace of SwissReNext. A great “Thank you” goes to our speakers and their commitment to support our event. Furthermore, we received greatly valued support in event organisation, catering and panel moderation from this year’s host Swiss Re. And last but certainly not least: the event would not have been possible without a truly fantastic ISCM Zurich organisation team. 
On 31st May over 100 cat modelling / exposure management professionals came together for a networking event in London. The venue was the Slug and Lettuce on St Mary Axe, opposite the world famous ‘Gherkin’ at the heart of the global insurance industry.
Amongst those in attendance were personnel from underwriters, brokers, reinsurers, vendor modellers, investment funds, regulators and the Lloyd’s market. Those present ranged from experienced individuals to new market entrants enjoying their first opportunity to get to know colleagues in the wider insurance market. Brian Owens, a current ISCM Board Member from RMS, was present to join in the discussion. Discussion topics were diverse, from conference debriefs to the upcoming hurricane season, new cat model releases, emerging risks and even the return period of an England victory at the 2018 football World Cup (history suggests 1-in-20; with tournaments held every four years, we could see an England victory this century). The event was sponsored by the ISCM alongside Ariel Re, Neon UW, Sirius IMA and Lloyd’s, and it was made possible by the organisational efforts of Giovanni Maccioni, David Ryan and Natalie Van Eck. Everyone agreed that an event of this type was long overdue in London, arguably the city with the largest community of catastrophe modellers in the world. We intend not to wait so long for the next get-together, with plans underway to hold another informal event in London at the end of September. - N. Van Eck / D. Ryan / G. Maccioni