Bahnemann Flashcards Preview

1
Q

Methods for Estimating Distribution Parameters

A

method of moments

maximum likelihood

minimum chi squared

minimum distance
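
Illustration (not from the Bahnemann text): a minimal Python sketch fitting a hypothetical lognormal severity by the method of moments and by maximum likelihood; the claim amounts are made up, and the formulas are the standard lognormal moment-matching and log-sample estimators.

```python
import numpy as np

# Hypothetical claim severities (purely illustrative data).
claims = np.array([1200.0, 350.0, 4800.0, 720.0, 1950.0, 9000.0, 510.0, 2730.0])

# --- Method of moments for a lognormal(mu, sigma) ---
# Match the sample mean and variance to E[X] = exp(mu + sigma^2/2)
# and Var[X] = (exp(sigma^2) - 1) * E[X]^2.
m, v = claims.mean(), claims.var(ddof=0)
sigma2_mom = np.log(1 + v / m**2)
mu_mom = np.log(m) - sigma2_mom / 2

# --- Maximum likelihood for a lognormal ---
# The MLE uses the mean and (biased) standard deviation of the log-claims.
logs = np.log(claims)
mu_mle, sigma_mle = logs.mean(), logs.std(ddof=0)

print(f"MoM: mu={mu_mom:.3f}, sigma={np.sqrt(sigma2_mom):.3f}")
print(f"MLE: mu={mu_mle:.3f}, sigma={sigma_mle:.3f}")
```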

2
Q

truncation

A

discarding; usually in the case of claims below a deductible

3
Q

censoring

A

capping; usually in the case of a policy limit

4
Q

shifting

A

usually with a straight deductible; claims larger than the deductible are reduced by the deductible amount

5
Q

since limits reduce the volatility of severity compared to unlimited data

A
  • may be of interest to compute the variability of losses in a layer
  • can use the coefficient of variation to measure the variability for different distributions
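
Illustration (my own, not from the text): a Monte Carlo sketch assuming a hypothetical lognormal severity, showing that censoring at a policy limit lowers the coefficient of variation relative to the unlimited variable.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical lognormal severities; parameters are illustrative only.
x = rng.lognormal(mean=8.0, sigma=1.5, size=200_000)

def cv(values):
    """Coefficient of variation: standard deviation divided by mean."""
    return values.std() / values.mean()

limit = 250_000
x_limited = np.minimum(x, limit)   # censoring at the policy limit

print(f"CV of unlimited severity: {cv(x):.3f}")
print(f"CV of severity capped at {limit:,}: {cv(x_limited):.3f}")
```
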
6
Q

claim contagion parameter

A

accounts for claim counts not being independent of each other (where one claim encourages others to file claims too)

-if claim counts follow a Poisson distribution, the contagion parameter γ = 0
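
Illustration (my own sketch, assuming the usual moment relationship E[N(N−1)] = (1 + γ)·E[N]²): backing γ out of simulated counts gives roughly 0 for Poisson counts and a positive value for an overdispersed negative binomial.

```python
import numpy as np

rng = np.random.default_rng(1)

def contagion(counts):
    """Back out gamma from E[N(N-1)] = (1 + gamma) * E[N]^2."""
    n = counts.mean()
    second_factorial_moment = (counts * (counts - 1)).mean()
    return second_factorial_moment / n**2 - 1

poisson_counts = rng.poisson(lam=5.0, size=500_000)
negbin_counts = rng.negative_binomial(n=5, p=0.5, size=500_000)  # overdispersed

print(f"gamma for Poisson counts:           {contagion(poisson_counts):+.4f}")  # ~ 0
print(f"gamma for negative binomial counts: {contagion(negbin_counts):+.4f}")   # > 0
```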

7
Q

final rate for a policy needs to incorporate

A

all other expenses and profit, as well as a charge for risk

8
Q

risk charge

A

premium amount used to cover contingencies such as:

  1. random deviations of losses from expected values (process risk)
  2. uncertainty in selection of parameters describing the loss process (parameter risk)
9
Q

for purposes of pricing, instead of publishing full rates for every limit

A

insurers usually publish rates for a basic limit and use relativities called ILFs to adjust them for higher limits

10
Q

ILFs can be determined using

A

empirical data directly, or from a theoretical curve fit to empirical data; the latter approach is more common for the highest limits, where there is little empirical loss data
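
Illustration (not the text's example): a sketch computing ILFs as ratios of limited expected severities, I(l) = E[X∧l] / E[X∧b], from a fitted Pareto curve; the basic limit and the Pareto parameters are assumptions.

```python
def limited_expected_value(l, theta=10_000.0, alpha=2.0):
    """E[X ^ l] for a Pareto (Lomax) severity with scale theta, shape alpha > 1."""
    return theta / (alpha - 1) * (1 - (theta / (theta + l)) ** (alpha - 1))

basic_limit = 100_000
limits = [100_000, 250_000, 500_000, 1_000_000]

# ILF(l) = E[X ^ l] / E[X ^ basic limit]  (frequency assumed equal across limits)
base = limited_expected_value(basic_limit)
for l in limits:
    print(f"ILF({l:>9,}) = {limited_expected_value(l) / base:.3f}")
```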

11
Q

in determining ILFs appropriate for each limit, the following assumptions are commonly made:

A
  1. all UW expenses and profit are variable and don’t vary by limit
    - in practice, profit loads might be higher for higher limits since they are more volatile
  2. frequency and severity are independent
  3. frequency is same for all limits
12
Q

ILFs must be (with Bahnemann's assumption that f_X(l) ≠ 0)

A

increasing at a decreasing rate

13
Q

in terms of premium, premium for successive layers of coverage of constant width

A

will be decreasing

14
Q

checking that a set of ILFs satisfies the above criteria for I'(l) and I''(l)

A

performing a consistency test: tabulate, for each per-occurrence limit l, the increased limit factor I(l) and the marginal rate per $1,000 of additional coverage, and verify that the marginal rates do not increase as the limit increases
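
Illustration (my own, with made-up ILFs): a sketch of the test, computing the marginal rate per $1,000 of additional coverage between successive limits and checking that it never increases.

```python
# Hypothetical (limit, ILF) pairs; values are illustrative only.
ilfs = [(100_000, 1.00), (250_000, 1.40), (500_000, 1.70), (1_000_000, 1.95)]

prev_marginal = float("inf")
consistent = True
for (lo, ilf_lo), (hi, ilf_hi) in zip(ilfs, ilfs[1:]):
    # Marginal increase in the ILF per $1,000 of additional coverage.
    marginal = (ilf_hi - ilf_lo) / ((hi - lo) / 1_000)
    print(f"{lo:>9,} -> {hi:>9,}: marginal rate per $1k = {marginal:.5f}")
    if marginal > prev_marginal:
        consistent = False
    prev_marginal = marginal

print("Passes consistency test" if consistent else "Fails consistency test")
```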

15
Q

an exception to consistency test

A

could occur if one of the ILF assumptions is violated; e.g., if liability lawsuits were influenced by the size of the limit, then frequency would not be the same for all limits and the formulas would not hold

16
Q

one reason that a set of increased limits factors may fail this consistency test yet still generate actuarially reasonable prices.

A

Adverse selection, which could happen if insureds that expect higher loss potential are more inclined to buy higher limits.

Adverse selection, which could happen if liability lawsuits are influenced by the size of the limit.

17
Q

how the consistency test has both a mathematical interpretation and a practical meaning

A

The practical interpretation is that as the limit increases, fewer losses are expected in the higher layers, so rates should not increase more for higher limits than for lower limits.

The mathematical interpretation is that I'(l) ≥ 0 and I''(l) ≤ 0

18
Q

there is more volatility (process risk) for policies with

A

higher limits or higher attachment points; insurers will also want to charge a risk load for these policies

-to do this, need to include a risk charge ρ(l)

19
Q

risk charge ρ(l) options

A

the old Miccolis approach, aka the variance method

the old ISO approach, aka the standard deviation method
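
Illustration only (the constants, severity model, and omission of claim-count/contagion terms are my simplifications, not Bahnemann's formulas): a sketch of the two shapes of risk charge, one proportional to the variance of limited losses and one to their standard deviation.

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical lognormal severities; purely illustrative.
x = rng.lognormal(mean=8.0, sigma=1.8, size=300_000)

def risk_charges(limit, k_var=5e-7, k_sd=0.05):
    """Illustrative risk charges for losses capped at `limit`.

    Variance method:  rho = k_var * Var(X ^ limit)
    Std dev method:   rho = k_sd  * StdDev(X ^ limit)
    (k_var and k_sd are arbitrary constants chosen for the example.)
    """
    capped = np.minimum(x, limit)
    return k_var * capped.var(), k_sd * capped.std()

for limit in (100_000, 500_000, 1_000_000):
    var_load, sd_load = risk_charges(limit)
    print(f"limit {limit:>9,}: variance-method charge {var_load:,.0f}, "
          f"std-dev-method charge {sd_load:,.0f}")
```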

20
Q

risk load increases as

A

policy limit increases and is used to take into account the higher process risk for policies with higher limits

21
Q

deductibles typically reduce

A

the coverage limit, so the layer of coverage with deductible d and limit l is (d, l] and not (d, d+l]

22
Q

3 types of deductibles:

A

straight

franchise

diminishing

23
Q

straight deductible

A

loss is truncated and shifted by d, such that the net loss is

X_d = X − d for X > d (and no claim for X ≤ d)

24
Q

franchise deductible

A

loss is truncated but not shifted by d, such that

X_d = X for X > d (and no claim for X ≤ d)

25
Q

diminishing deductible

A
  • aka disappearing deductible
  • loss below amount d is fully eliminated, the deductible declines linearly from d to a larger amount D, and loss above D is paid in full

X_{d,D} = D/(D − d) × (X − d) for d < X ≤ D
X_{d,D} = X for X > D
X_{d,D} = 0 for X ≤ d

-the formula for the LER, even assuming ALAE is not additive, is long, so calculate the loss eliminated at each size-of-loss level as a % of total losses
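
Illustration (my own sketch): the net-loss transformations for the three deductible types, applied to a single ground-up loss x; the deductible parameters d and D and the sample losses are made up.

```python
def straight(x, d):
    """Straight deductible: loss is shifted down by d (and floored at 0)."""
    return max(x - d, 0.0)

def franchise(x, d):
    """Franchise deductible: losses above d are paid in full, below d nothing."""
    return x if x > d else 0.0

def diminishing(x, d, D):
    """Disappearing deductible: nothing below d, full payment above D,
    linear blend D/(D - d) * (x - d) in between."""
    if x <= d:
        return 0.0
    if x >= D:
        return x
    return D / (D - d) * (x - d)

# Net loss paid under each deductible type for a few sample losses.
for x in (400, 900, 1_500, 6_000):
    print(x, straight(x, 500), franchise(x, 500), diminishing(x, 500, 5_000))
```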

26
Q

when comparing the 3 deductibles

A

straight eliminates the most loss, then diminishing, then franchise

27
Q

when pricing a layer of coverage and applying a risk load, it is not appropriate to

A

simply subtract risk-loaded ILFs or premiums for the layer, since that would result in an incorrect risk load

while without risk loading we have

P_{a,l} = P_{a+l} − P_a = P_b × [I(a+l) − I(a)]

-these relationships do not hold when risk loading is applied
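
Illustration with made-up numbers: the no-risk-load relationship P_{a,l} = P_{a+l} − P_a = P_b·[I(a+l) − I(a)] for a layer attaching at a with width l.

```python
# Illustrative basic-limit premium and ILFs (no risk load).
P_b = 1_000.0          # basic limit premium
ilf = {500_000: 1.70, 1_000_000: 1.95, 1_500_000: 2.10}

a, l = 500_000, 1_000_000          # attachment a, layer width l
layer_premium = P_b * (ilf[a + l] - ilf[a])

# Equivalently, the difference of the full-coverage premiums.
P_a, P_top = P_b * ilf[a], P_b * ilf[a + l]
assert abs(layer_premium - (P_top - P_a)) < 1e-9

print(f"Premium for layer ({a:,}, {a + l:,}]: {layer_premium:,.0f}")
```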

28
Q

Assuming retention R, describe how inflation will affect the expected losses for the excess cover relative to unlimited ground-up losses.

A

Inflation would affect excess losses more than total losses. Some losses that were below the retention will now get into the excess layer. Also, for losses that were already in the excess layer, the increase in losses due to inflation will affect the excess layer entirely.
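
Illustration with made-up numbers (the retention and losses are hypothetical): a quick check of the leveraging effect, where a uniform 10% trend produces a much larger trend in the excess layer.

```python
retention = 100_000
trend = 1.10

# Hypothetical ground-up losses: one just below and one above the retention.
losses = [95_000, 150_000]

def excess(values, r):
    """Total losses in the layer excess of retention r."""
    return sum(max(x - r, 0) for x in values)

ground_up_before = sum(losses)
ground_up_after = sum(x * trend for x in losses)
excess_before = excess(losses, retention)
excess_after = excess([x * trend for x in losses], retention)

print(f"Ground-up trend: {ground_up_after / ground_up_before - 1:.1%}")   # 10.0%
print(f"Excess trend:    {excess_after / excess_before - 1:.1%}")         # much higher
```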

29
Q

reason why it is desirable for a set of increased limits factors to pass this consistency test.

A

So long as partial losses are possible, expected losses will not increase as much as the increase in limits, so the rate per $1k of coverage should decrease as the limit increases.

30
Q

diagram for expected losses

A

y-axis = loss size

x-axis = F_X(x), the cumulative loss distribution

31
Q

why the loss cost for a given straight deductible policy can increase more than the ground-up severity trend.

A

For losses above the deductible, the trend is entirely in the excess layer.

Also, losses just under the deductible are pushed into the excess layer by the trend, creating new losses for the excess layer.

32
Q

How do the policy conditions alter the coefficient of variation of the claim-size variable

A

Policy conditions restrict the variability of claims, so the coefficient of variation with the policy restrictions will be lower than the coefficient of variation for the unlimited claim-size variable.

33
Q

Briefly describe one reason why increased limit factors might fail a consistency test and still produce reasonable rates

A

If adverse selection is occurring because higher risk insureds are more likely to purchase higher limits, then it would be reasonable to have factors that fail the consistency test (e.g., ILFs that increase at an increasing rate).