


NAME

Decision::Markov - Markov models for decision analysis


SYNOPSIS


  use Decision::Markov;

  $model = new Decision::Markov;

  $state = $model->AddState("Name",$utility);

  $error = $model->AddPath($state1,$state2,$probability);

  $error = $model->Check;

  $model->Reset([$starting_state,[$number_of_patients]]);

  $error = $model->StartingState($starting_state[,$number_of_patients]);

  $model->DiscountRate($rate);

  ($utility,$cycles) = $model->EvalMC();

  $state = $model->EvalMCStep($cycle);

  ($utility,$cycles) = $model->EvalCoh();

  $patients_left = $model->EvalCohStep($cycle);

  $model->PrintCycle($FH,$cycle);

  $model->PrintMatrix($FH);


DESCRIPTION

This module provides functions used to build and evaluate Markov models for use in decision analysis. A Markov model consists of a set of states, each with an associated utility, and links between states representing the probability of moving from one state to another. States typically include links to themselves. Utilities and probabilities may be fixed or may be functions of the time in cycles since the model began running.
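The structure described above can be sketched outside Perl as well. A minimal Python sketch follows; the state names, utilities, and probabilities are illustrative only and are not part of the module:

```python
# A two-state model, "Well" and "Dead" (illustrative values only).
# Each state carries a per-cycle utility; the probabilities of the
# paths leaving a state must sum to 1, and the absorbing "Dead"
# state links only to itself.
states = ["Well", "Dead"]
utility = {"Well": 1.0, "Dead": 0.0}
prob = {  # prob[src][dst] = per-cycle transition probability
    "Well": {"Well": 0.9, "Dead": 0.1},
    "Dead": {"Dead": 1.0},
}

# The kind of consistency check the Check method performs:
for src, row in prob.items():
    assert abs(sum(row.values()) - 1.0) < 1e-9, f"paths from {src} != 1"
```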


METHODS

new
Create a new Markov model.

AddState
Add a state to the model. The arguments are a string describing the state and the utility of the state. The utility may be specified either as a number or as a reference to a subroutine that returns the utility. The subroutine will be passed the current cycle number as an argument. Returns the new state, which is an object of class Decision::Markov::State.

AddPath
Adds a path between two states. The arguments are the source state, the destination state, and the probability of transition.

Probability may be specified either as a number or as a reference to a subroutine that returns the probability. The subroutine will be passed the current cycle number as an argument.

AddPath returns undef if successful, error message otherwise.

Check
Checks all states in the model to verify that the probabilities of the paths from each state sum to 1. Returns undef if the model checks out, error message otherwise.

Reset
Resets the model. Use before evaluating the model.

StartingState
Sets the state in which patients start when the model is evaluated. The optional second argument sets the number of patients in a cohort when performing a cohort simulation.

Returns undef if successful, or an error message otherwise.

DiscountRate
Sets the per-cycle discount rate for utility. By default, there is no discounting. To set, for example, 3%/cycle discounting, use $model->DiscountRate(.03);

If no discount rate is given, returns the current discount rate.
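The documentation does not spell out the discount formula. Assuming conventional per-cycle discounting, in which utility earned in cycle t is weighted by 1/(1 + rate)**t, the accumulation would look like this Python sketch:

```python
def discounted_total(per_cycle_utilities, rate):
    """Sum per-cycle utilities, weighting cycle t by 1/(1 + rate)**t.
    Assumed formula: standard discounting, not confirmed by these docs."""
    return sum(u / (1.0 + rate) ** t
               for t, u in enumerate(per_cycle_utilities))

# Three cycles of utility 1.0 at 3%/cycle discounting:
total = discounted_total([1.0, 1.0, 1.0], 0.03)
# With rate 0 this reduces to a plain sum of the utilities.
```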

EvalMC
Performs a Monte Carlo simulation of a single patient through the model, and returns that patient's cumulative utility and the number of cycles the model ran. The patient begins in the state set by StartingState.
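The Monte Carlo evaluation amounts to a random walk along the transition probabilities. A Python sketch of the idea follows; the absorbing-state test and the max_cycles cap are assumptions for illustration, not the module's internals:

```python
import random

def eval_mc(prob, utility, start, max_cycles=1000, rng=random):
    """Walk one patient through the model.
    Returns (cumulative utility, number of cycles run)."""
    state, total = start, 0.0
    for cycle in range(max_cycles):
        total += utility[state]
        # Treat a state whose only path is a self-link as absorbing.
        if prob[state].get(state, 0.0) == 1.0:
            return total, cycle + 1
        # Draw the next state from this state's outgoing probabilities.
        dests, weights = zip(*prob[state].items())
        state = rng.choices(dests, weights=weights)[0]
    return total, max_cycles

prob = {"Well": {"Well": 0.9, "Dead": 0.1}, "Dead": {"Dead": 1.0}}
utility = {"Well": 1.0, "Dead": 0.0}
total, cycles = eval_mc(prob, utility, "Well", rng=random.Random(0))
```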

EvalMCStep
Given the current model cycle, evaluates a single step of the Markov model, and returns the patient's new state. Internally continues to track the patient's cumulative utility.

EvalCoh
Performs a cohort simulation of the model and returns the average cumulative utility of a patient in the cohort, and the number of cycles the model ran. The number of patients and their initial state are set with StartingState.

EvalCohStep
Evaluates a single cycle of a cohort simulation. Returns the number of patients who will change states in the next cycle (i.e., if it returns 0, you're at the end of the model run).
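A single cohort cycle redistributes the patients along each state's outgoing paths. A Python sketch of the bookkeeping (illustrative, not the module's implementation):

```python
def cohort_step(counts, prob):
    """One cycle: redistribute patients along outgoing transitions.
    Returns (new_counts, number of patients who changed state)."""
    new_counts = {s: 0.0 for s in counts}
    moved = 0.0
    for src, n in counts.items():
        for dst, p in prob[src].items():
            new_counts[dst] += n * p
            if dst != src:
                moved += n * p
    return new_counts, moved

# 1000 patients start Well, with a 10%/cycle chance of dying
# (illustrative numbers only).
prob = {"Well": {"Well": 0.9, "Dead": 0.1}, "Dead": {"Dead": 1.0}}
counts, moved = cohort_step({"Well": 1000.0, "Dead": 0.0}, prob)
# counts is approximately {"Well": 900.0, "Dead": 100.0}; moved is 100.0.
# Once moved reaches 0, everyone is in an absorbing state and the run ends.
```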

PrintCycle
Given a FileHandle object and the cycle, prints the current distribution of patients in the cohort (if a cohort simulation is in progress) or the current state and utility of the patient (if a Monte Carlo simulation is in progress).

PrintMatrix
Given a FileHandle object, prints the model in transition matrix form.


REFERENCES

Sonnenberg, F. A. & Beck, J. R. (1993). Markov Models in Medical Decision Making: A Practical Guide. Med. Dec. Making, 13: 322-338.


COPYRIGHT

Copyright (c) 1998 Alan Schwartz <alansz@uic.edu>. All rights reserved. This program is free software; you can redistribute it and/or modify it under the same terms as Perl itself.


REVISION HISTORY

  0.01  March 1998 - Initial concept.
