How to define a variable conditional on a state of a Markov chain

Issue #62 invalid
Former user created an issue

I want to define a variable with different distributions based on the current state of a Markov chain.

For example, I define a weather variable using a Markov chain as:

weather = markov.chain_from_matrix(('sunny','rainy'), ('sunny',(0.95, 0.05)), ('rainy',(0.8, 0.2)))

I want the temperature to have a different distribution depending on today's weather.

Ideally, I would like something like:

temp_dist = weather.switch({'sunny': p_sunny, 'rainy': p_rainy})

But Lea raises an error:

AttributeError: 'Chain' object has no attribute 'switch'
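
For reference, here is a minimal, self-contained reproduction of this setup. It assumes Lea 3's API (the markov module imported from the lea package, and the module-level lea.binom function); the p_sunny and p_rainy definitions are placeholder temperature distributions added here only so the snippet runs, they are not part of the original report.

    import lea
    from lea import markov

    # Weather as a two-state Markov chain (transition rows from the report).
    weather = markov.chain_from_matrix(
        (          'sunny', 'rainy'  ),
        ('sunny', ( 0.95  ,  0.05  )),
        ('rainy', ( 0.80  ,  0.20  )))

    # Placeholder temperature distributions for each weather state.
    p_sunny = 18 + lea.binom(4, 0.5)
    p_rainy = 10 + lea.binom(4, 0.5)

    # The call from the report: it fails because `weather` is a Chain object,
    # not a probability distribution over states.
    temp_dist = weather.switch({'sunny': p_sunny, 'rainy': p_rainy})
    # AttributeError: 'Chain' object has no attribute 'switch'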

Comments

  1. Pierre Denis (repo owner)

    A Markov chain is not a probability distribution, so the switch method cannot be called on it. To get a probability distribution, you first need to define an initial state distribution, e.g. “yesterday was sunny, with x% chance”. You can easily obtain a uniform initial distribution using the .state attribute (see the documentation):

    >>> yesterday_weather = weather.state
    >>> yesterday_weather
    sunny : 0.5
    rainy : 0.5
    

    From this initial state, the Markov chain can build a distribution for today:

    >>> today_weather = weather.state.next_state(1)
    

    (or for later days, replacing 1 with 2, 3, etc.). Then you can define today's temperature using a switch (i.e. a conditional probability table). For instance:

    >>> p_sunny = 18 + Lea.binom(4,0.5)
    >>> p_rainy = 10 + Lea.binom(4,0.5)
    >>> today_temperature = today_weather.switch({'sunny': p_sunny, 'rainy': p_rainy})
    

    As a sanity check,

    >>> today_temperature.given(today_weather=='sunny')
    18 : 0.0625
    19 : 0.25
    20 : 0.375
    21 : 0.25
    22 : 0.0625 
    

    … which is the p_sunny temperature distribution, as expected. (The full workflow is pulled together in the sketch below.)
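
Putting the comment's steps together, a self-contained sketch of the whole workflow might look as follows. It assumes Lea 3's module-level functions (lea.binom rather than Lea.binom) and keeps the illustrative temperature distributions from the comment; the weather_in_3_days variable at the end is only an extra example of stepping the chain further.

    import lea
    from lea import markov

    # 1. Weather as a Markov chain (transition rows from the issue).
    weather = markov.chain_from_matrix(
        (          'sunny', 'rainy'  ),
        ('sunny', ( 0.95  ,  0.05  )),
        ('rainy', ( 0.80  ,  0.20  )))

    # 2. Uniform initial distribution over states ("yesterday"), then one
    #    step of the chain to get a distribution for today's weather.
    yesterday_weather = weather.state
    today_weather = yesterday_weather.next_state(1)

    # 3. Temperature distributions conditional on the weather (illustrative).
    p_sunny = 18 + lea.binom(4, 0.5)
    p_rainy = 10 + lea.binom(4, 0.5)

    # 4. Conditional probability table keyed on today's weather.
    today_temperature = today_weather.switch({'sunny': p_sunny,
                                              'rainy': p_rainy})

    # Sanity checks: conditioning on the weather recovers each per-state
    # distribution; the unconditioned distribution is their mixture,
    # weighted by today's weather probabilities.
    print(today_temperature.given(today_weather == 'sunny'))
    print(today_temperature.given(today_weather == 'rainy'))
    print(today_temperature)

    # Extra: forecast further ahead by taking more steps of the chain.
    weather_in_3_days = yesterday_weather.next_state(3)
    print(weather_in_3_days)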
