Disappearing Risk

June 19, 2013
Stephen Mildenhall

“You can’t exclude your way to prosperity and sublimit your way to relevance. You have to serve the client and reclaim your space.”
– Mike McGavick, CEO of XL Capital, 2012 Monte Carlo Rendezvous

2012 marked the ninth consecutive year of softening property casualty pricing, as measured by net written premium as a proportion of US GDP. In 2010 the proportion moved below 3% for the first time since 1974, and it stayed below 3% in 2011 and 2012. In 2003, the peak of the last hard market, property casualty insurance premium represented 3.6% of the total economy; by 2012 the proportion had decreased to 2.9%, a reduction of 0.7 percentage points of the entire economy – a massive movement for the industry. And all without the impact of driverless cars!

At the same time, the industry reported a combined ratio of 103% in 2012 despite the effects of Superstorm Sandy. Without Sandy losses the industry would have flirted with another sub-100% combined ratio year; indeed, through the first nine months of 2012 the US statutory industry posted a combined ratio of 100.9%.

Given historically low pricing and one of the costliest windstorms of all time, 103% is a surprisingly strong result, albeit one representing a return well below the industry’s cost of capital in today’s low interest rate environment. How can we reconcile such a relatively strong underwriting result with historically low premium levels?

The graphs below show historical frequency indices for several major lines of business. Auto frequency, measured by the rate of fatal accidents per 100 million miles driven, has declined 60% since 1980, a 3 point average annual decline. Workers compensation lost-time frequency has declined steadily since 1990 by a total of 55%, about a 4 point average annual decline. Medical malpractice frequency has benefited from tort reforms in many states; the rate of suits per 100 physicians has fallen 50% since 1991, or 4 points per year. General liability shows a similar trend since 1991. Outside the liability sphere, we even see a decline in the number of fires: since 1977 the number of structure fires in the US has fallen by more than 50%, more than 2 points per year on average.

[Figure: historical frequency trends by line of business]
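
As a quick check on these annualized figures, a total decline of D over n years implies a compound average annual rate of decline of

```latex
r = 1 - (1 - D)^{1/n}
```

For auto, for example, a 60% total decline over the 32 years from 1980 to 2012 gives r = 1 - 0.40^(1/32), approximately 2.8% per year, consistent with the 3 point figure quoted above.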

Frequency is obviously only half the loss cost story; severity is the other half. Severity trends have been broadly stable, with increases within a few points of the CPI or medical CPI for most lines over the last decade. Since many lines are rated on an inflation-sensitive exposure basis, the resulting premium increases, combined with frequency declines, have been enough to offset severity increases and have sustained underwriting results through the soft market to a much greater degree than expected. This phenomenon is partly responsible for the favorable reserve development the industry has reported since 2003.
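
To spell out the arithmetic, loss cost trend combines frequency and severity multiplicatively. With an illustrative 3% annual severity increase s and a 3% annual frequency decline f (hypothetical round numbers, not industry estimates):

```latex
\text{loss cost trend} = (1 + s)(1 - f) - 1 = (1.03)(0.97) - 1 \approx -0.1\%
```

If inflation-sensitive exposure pushes premium up by roughly the same 3% a year, the implied loss ratio improves by about three points annually, which is how stable severity plus falling frequency can carry underwriting results through a soft market.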

Combining all lines of business and comparing industry net incurred loss to GDP produces a very interesting picture (below). Loss to GDP rose steadily from 1970 to 1986, increasing from below 2% to above 3% over that period. 1987 saw a sea change in US insurance: a new tax law required discounting incurred losses, making cash flow underwriting much less attractive; the claims-made form and the absolute pollution exclusion were introduced; and, in casualty lines, we saw the beginning of a trend to limit coverage and exclude risk. Since 1987 loss as a percentage of GDP has decreased from 3% back to 2%. The consistency of the decline is masked by the clear impact of the reserving cycle as well as random year-to-year variability in catastrophe losses.

The loss to GDP graph aggregates the impact of favorable frequency trends and shows that they have produced a material shift in insurance penetration of the economy. It shows the extent to which insurers, all eager to grow, are now fighting over a shrinking cake. The traditionally insured part of the world is becoming safer: loss control programs, health warnings and risk awareness, drunk driving enforcement and graduated driver licensing are all reducing the premium and loss size of standard, lower limit liability policies.

[Figure: calendar year incurred loss as a percentage of US GDP]

Against this trend of the “disappearing small risk” we have seen other important trends in recent years.

The first has been increasing property losses driven by catastrophic events such as Superstorm Sandy. Property premium globally has increased by 4.2% annually over the last five years, compared to 3.2% for motor and just 1.5% for liability, the two other major insurance sectors. In the US, without premium increases in property of around 15% since 2006, total statutory premium would have shrunk by much more than the 2% drop actually observed.

Despite the increase in property exposure, global reinsurance capacity remains more than adequate to meet demand. Globally, the peak reinsured exposure remains US hurricane, followed by US earthquake and then European and Japanese wind and earthquake exposures. The amount of risk transferred into the private reinsurance market is greatly reduced by a number of implicit or explicit government pools and public sector solutions, most notably the widespread non-insurance of earthquake in the US and the Japan Earthquake Reinsurance pool. Many of these pools were created when reinsurance capacity was lower and not deemed sufficient to meet private market demand. But with today’s record levels of reinsurance capacity, estimated by Aon Benfield Analytics to be in excess of $0.5 trillion, the opportunity exists to move substantial risk into the private insurance and reinsurance markets. Ballooning public sector debt is a further reason why pre-funded reinsurance is the superior risk management solution.

Another important trend has been the emergence of corporate liability losses on a scale that would have been hard to imagine just five years ago. Companies in the oil drilling, power generation and pharmaceutical industries, as well as numerous companies in financial services, have suffered actual or market capitalization losses of tens of billions of dollars or more over the last two years. Almost all of these losses have been uninsured. Many other potential exposures of a similar size exist in the market: railway liability from an explosion or derailment of a dangerous cargo in an urban area, the multi-channel accumulation of bisphenol A (BPA) through the bio-system, and breach of privacy related to social media or other cyber-crime, to name just a few. In fact, the insurance world is mirroring the financial and manufacturing worlds: systems are becoming more reliable on average and less susceptible to minor disruptions, but have an increased fragility and exposure to major catastrophic losses. Just-in-time inventory systems and globally distributed manufacturing led to some surprising business interruption claims from the Tohoku earthquake, for example. And the contagion in the financial crisis of 2007-09 is well known.

Just as insurance capacity has been brought to bear to effectively transfer and manage property risks once thought too large, the industry today needs a similar expansion of capacity for casualty and non-traditional exposures to help drive meaningful growth over the next decade. When insureds move to self-insurance or no insurance, an important independent oversight function is lost: the loss control and process review of the risk taker. Such a service, backed by a risk indemnity, has been so effective for small risks that they are rapidly disappearing, as the graphs above show. To reverse the resulting long-term decline in the insurance sector, reflected in decreasing loss to GDP, the industry needs to grapple creatively and constructively with large, harder-to-quantify risks.

What lessons can we draw from the successful transfer of mega-property risks to insurers and reinsurers for potential casualty capacity? Analytics and catastrophe models bring science and statistics to bear on the problem of estimating losses. Models provide a common currency for risk, one shared by risk assumers, regulators, rating agencies and, crucially, capital providers, and they support market pricing that is transparent and predictable. The same computer modeling techniques used so successfully to manage property risk are now developing to the point where some mega-casualty perils could be understood well enough to support insurance solutions, provided there is adequate demand.
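
To make the “common currency” point concrete, here is a minimal sketch, using a purely hypothetical event loss table (none of these numbers come from a real model), of two standard catastrophe model outputs: the average annual loss and an occurrence exceedance probability curve, assuming events arrive as independent Poisson processes.

```python
import numpy as np

# Hypothetical event loss table -- all figures are illustrative only,
# not taken from any actual catastrophe model.
event_rate = np.array([0.10, 0.04, 0.01, 0.002])  # annual event frequency
event_loss = np.array([0.5, 2.0, 10.0, 50.0])     # loss per event, $bn

# Average annual loss (AAL): expected annual loss = sum of rate x loss.
aal = (event_rate * event_loss).sum()

def occurrence_ep(x):
    """P(at least one event this year causes a loss >= x),
    assuming independent Poisson event arrivals."""
    rate = event_rate[event_loss >= x].sum()  # total rate of events >= x
    return 1.0 - np.exp(-rate)

print(f"AAL: ${aal:.2f}bn")
for x in (1.0, 10.0, 50.0):
    print(f"P(max occurrence loss >= ${x:g}bn) = {occurrence_ep(x):.2%}")
```

These two quantities, an expected loss for pricing and an exceedance curve for capital, are the shared vocabulary that lets underwriters, regulators, rating agencies and capital providers talk about the same risk in the same terms.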

The global pooling provided by reinsurance, whether traditional or non-traditional capital markets, is key to creating capacity: reinsurance provides property capacity for a number of independent but highly volatile risk towers that together support adequate, and therefore manageable, leverage of capital and hence adequate returns on capital. Success in providing higher capacity for casualty and other non-property risks will be based not on one or two highly skewed covers, but on an adequate portfolio of such covers. Success will feed on itself, with more risk attracting more capital and driving more cost-effective coverage. The first steps will come from an open attitude to risk among insurers and, equally importantly, a greater appreciation of the value of the insurance product among insureds. Agents and brokers must continue to work with insureds, insurers and the reinsurance markets to facilitate a creative and responsible approach to risk innovation through new products, often with reinsurance support, driving real insurance growth in the coming decade.
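
A toy simulation of this pooling effect, with purely illustrative loss assumptions: compare the capital needed, measured here as the 99.5th percentile annual loss in excess of the expected loss, for a single highly skewed cover against the per-cover capital of a pool of twenty independent covers.

```python
import numpy as np

rng = np.random.default_rng(2013)
n_sims, n_covers = 200_000, 20

def simulate_cover(n):
    """One highly skewed cover: a 5% chance per year of a lognormal loss.
    Parameters are purely illustrative."""
    hit = rng.random(n) < 0.05
    return np.where(hit, rng.lognormal(mean=3.0, sigma=1.0, size=n), 0.0)

def capital(losses):
    """Risk capital proxy: 99.5th percentile annual loss less the mean."""
    return np.percentile(losses, 99.5) - losses.mean()

standalone = simulate_cover(n_sims)                          # one cover alone
pooled = sum(simulate_cover(n_sims) for _ in range(n_covers))  # 20 covers pooled

print(f"Capital per cover, written alone: {capital(standalone):8.1f}")
print(f"Capital per cover, pooled:        {capital(pooled) / n_covers:8.1f}")
```

Under these toy assumptions the per-cover capital in the pool is a small fraction of the standalone requirement; that diversification is precisely the leverage that makes adequate returns on skewed covers possible.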

Stephen Mildenhall is the Global CEO of Analytics for Aon. Stephen is based in Singapore.

A previous version of this article was published in the seventh edition of Aon Benfield’s Insurance Risk Study (2012), available at www.aon.com.
