Mathematics / Mathematical Statistics / SF2937

Current information
Solved problem 4.2, where we used the interpretation of the Birnbaum measure as the probability that the component is critical. Solved problem 4.5, which showed how we can tailor-make an importance measure, in this case to select a component to check (maintain) given that the system works. The Vesely-Fussell measure, i.e. the probability that a minimal cut that contains the component has failed given that the system has failed. Solved problem 4.4, an old exam problem about the Vesely-Fussell measure (and about associated variables), as well as 4.7 and 4.11 a. Next time: question time and/or an old exam.
Component importance. Described structural importance and Birnbaum's measure, i.e. the probability that the component is critical. Showed that the structural importance can be calculated from the Birnbaum measure by letting all probabilities be 1/2. Analyzed the system in problem 4.1 and some simple situations such as series and parallel systems. Described criticality importance, which is the probability that the component is critical and has failed, given that the system has failed. Solved parts of 4.3 and calculated Birnbaum's measure and the criticality importance.
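As a sketch of how Birnbaum's measure can be computed in practice, consider a hypothetical three-component system (component 1 in series with the parallel pair 2, 3; the system and the probabilities below are made up, not taken from the exercises). The measure is I_B(i) = h(1_i, p) - h(0_i, p), and the structural importance is the same quantity with all p_i = 1/2:

```python
from itertools import product
from math import prod

def phi(x):
    # Hypothetical example system: component 1 in series with
    # the parallel pair (2, 3).
    x1, x2, x3 = x
    return x1 * (1 - (1 - x2) * (1 - x3))

def reliability(p):
    """h(p) = E[phi(X)] for independent components, by enumeration."""
    return sum(
        phi(x) * prod(pi if xi else 1 - pi for xi, pi in zip(x, p))
        for x in product([0, 1], repeat=len(p))
    )

def birnbaum(p, i):
    """I_B(i) = h(p with p_i = 1) - h(p with p_i = 0):
    the probability that component i is critical."""
    hi = list(p); hi[i] = 1.0
    lo = list(p); lo[i] = 0.0
    return reliability(hi) - reliability(lo)

p = [0.9, 0.8, 0.7]
ib = [birnbaum(p, i) for i in range(3)]
# Structural importance: Birnbaum's measure with all probabilities 1/2.
struct = [birnbaum([0.5, 0.5, 0.5], i) for i in range(3)]
```

For these made-up numbers the series component dominates, as expected: ib = [0.94, 0.27, 0.18] and struct = [0.75, 0.25, 0.25].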
On renewal processes as a generalization of the Poisson process, where the times between events have an arbitrary (positive) distribution. Solved problems 7.1 and 7.2. The chapter on renewal theory will be largely ignored, apart from the results about the (approximate) expectation and variance of the number of renewals and the fact that it is approximately normally distributed. Solved problems 7.5 and 7.8.
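A sketch of the approximations in question (here μ and σ denote the mean and standard deviation of the inter-event times; the numbers in the example call are arbitrary). For large t, E[N(t)] ≈ t/μ and Var[N(t)] ≈ tσ²/μ³, and N(t) is approximately normal:

```python
from math import erf, sqrt

def renewal_normal_approx(t, mu, sigma):
    """Large-t approximations for the number of renewals N(t):
    E[N(t)] ~ t/mu, Var[N(t)] ~ t*sigma^2/mu^3."""
    m = t / mu
    v = t * sigma**2 / mu**3
    return m, v

def prob_at_most(n, t, mu, sigma):
    """P(N(t) <= n) via the normal approximation (no continuity correction)."""
    m, v = renewal_normal_approx(t, mu, sigma)
    z = (n - m) / sqrt(v)
    return 0.5 * (1 + erf(z / sqrt(2)))

# Sanity check: Exp(1/2) inter-event times (mu = sigma = 2) give a
# Poisson process, where mean and variance of N(100) are both exactly 50.
m, v = renewal_normal_approx(100, 2.0, 2.0)
```

The exponential case is a useful check, since there the approximation reproduces the exact Poisson mean and variance.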
Associated variables. Solved problems 6.1, 6.2, 6.3, 6.4, and 6.8. Indicated how we can prove the theorem that for binary X and Y, Cov(X,Y)≥0 implies that X and Y are associated, by testing all possible combinations. We start with component importance next time.
Treated dependence between components and especially the concept of associated variables, which reflects a "positive" dependence between components. Showed a number of theorems concerning associated variables. We especially concentrated on finding inequalities for the functioning probabilities of coherent systems consisting of associated components. Solved an old exam problem as an application of these bounds.
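A minimal illustration of positive dependence (the common-shock construction and all numbers are hypothetical, not from the exam problem): let X = A·C and Y = B·C with A, B, C independent Bernoulli indicators, where C is a shared "environment" variable. X and Y are then associated, so in particular P(X=1, Y=1) ≥ P(X=1)P(Y=1):

```python
from itertools import product

# Hypothetical common-shock model: X = A*C, Y = B*C.
pA, pB, pC = 0.9, 0.8, 0.95

def pr(a, b, c):
    # Joint probability of the independent indicators (A, B, C).
    return ((pA if a else 1 - pA) * (pB if b else 1 - pB)
            * (pC if c else 1 - pC))

outcomes = list(product([0, 1], repeat=3))
p_xy = sum(pr(a, b, c) for a, b, c in outcomes if a * c == 1 and b * c == 1)
p_x = sum(pr(a, b, c) for a, b, c in outcomes if a * c == 1)
p_y = sum(pr(a, b, c) for a, b, c in outcomes if b * c == 1)

# Association gives Cov(X, Y) >= 0, i.e. P(X=1, Y=1) >= P(X=1)P(Y=1):
assert p_xy >= p_x * p_y
```

Here the exact values are P(X=1, Y=1) = 0.684 versus P(X=1)P(Y=1) = 0.6498, so independence would understate the probability that both components work.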
Dual systems were introduced by solving problem 3.10. Showed that dual systems can be obtained by exchanging parallel and series connections. Solved problem 3.14 as an illustration of how to go from structure function to fault tree. Solved problems 3.17, 3.16, 3.18, 3.22.
Showed that componentwise redundancy is more reliable than systemwise redundancy. Solved problems 3.8 and 3.9 as illustrations of how to "invert" a structure function, i.e. go from a structure function to a block diagram. Solved 3.1 a to practice finding minimal paths and minimal cuts. Introduced fault trees and ran the MOCUS algorithm on the example in the textbook, which shows how to go from a fault tree to a structure function. A bit about independent components: showed that if the structure function is in basic form (no nonlinear terms), we obtain the system reliability by replacing the binary variables with the corresponding probabilities. Solved problem 3.13 as an illustration.
Finished the section on Markov models by solving problem 5.8. Chapter 3 in the textbook about structure functions and reliability block diagrams, especially section 3.5. About structure functions (binary functions of binary arguments). Defined relevant components and coherent systems. Exemplified with series systems, parallel systems and 2-out-of-3 systems. Defined paths, cuts, minimal paths and minimal cuts. Showed that a coherent system can be expressed as a parallel system of the minimal paths or, alternatively, as a series structure of the minimal cuts. A bit about pivotal decomposition (splitting the analysis according to whether a certain component works or not). Treated the bridge structure in Example 3.5 in the textbook and showed how to use pivotal decomposition. Stressed the importance of having the structure function in basic form (i.e. only linear terms). Used the minimal paths and minimal cuts to get the structure function.
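The pivotal decomposition of the bridge can be sketched as follows (the component probabilities are made up; the minimal paths {1,4}, {2,5}, {1,3,5}, {2,3,4} are the standard ones for the five-component bridge, with 3 as the bridge component). Conditioning on component 3 gives two simple series-parallel systems, and the result agrees with brute-force enumeration of the structure function:

```python
from itertools import product
from math import prod

def phi(x):
    # Bridge structure function from the minimal paths
    # {1,4}, {2,5}, {1,3,5}, {2,3,4}.
    x1, x2, x3, x4, x5 = x
    return 1 if (x1 and x4) or (x2 and x5) \
                or (x1 and x3 and x5) or (x2 and x3 and x4) else 0

def h_enum(p):
    """System reliability by enumerating all 2^5 component states."""
    return sum(phi(x) * prod(pi if xi else 1 - pi for xi, pi in zip(x, p))
               for x in product([0, 1], repeat=5))

def h_pivot(p):
    """Pivotal decomposition on the bridge component 3:
    h(p) = p3*h(1_3, p) + (1 - p3)*h(0_3, p)."""
    p1, p2, p3, p4, p5 = p
    works = (1 - (1-p1)*(1-p2)) * (1 - (1-p4)*(1-p5))  # 3 works: (1||2)-(4||5)
    fails = 1 - (1 - p1*p4) * (1 - p2*p5)              # 3 failed: (1-4)||(2-5)
    return p3 * works + (1 - p3) * fails

p = [0.9, 0.8, 0.7, 0.6, 0.5]
assert abs(h_enum(p) - h_pivot(p)) < 1e-12
```

The decomposition turns one five-component problem into two problems that can be read directly off the block diagram.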
Markov models. Solved 5.7, 5.9, 5.10, 5.12 - did not complete all details, but introduced states and calculated Q and also indicated how to solve the rest of the problem.
Markov models: Discussed the fact that in problem 5.3 the concepts MDT (Mean Down Time) and MTBF (Mean Time Between Failures) are not well-defined. They work if there is only one system failure state. Solved problems 5.4, 5.5 a and 5.6 as illustrations of how different situations can be modelled using Markov processes.
Birth-and-death processes and how we can find the stationary distribution algorithmically. In addition I discussed the M/M/1 queueing system, i.e. a system where customers arrive according to a Poisson(λ) process, (possibly) queue, and then are given service which takes an Exp(μ)-distributed time. We found that there exists a stationary distribution if λ<μ and that the process is then ergodic. Solved problem 55 (the Ehrenfest urn model) and obtained the stationary distribution, which turned out to be Bin(N,1/2), a result which seems self-evident in retrospect. Markov models. Solved 5.2, 5.3.
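The algorithm for the stationary distribution of a birth-and-death process can be sketched like this: build the unnormalized weights ρ_k = Π_{j&lt;k} λ_j/μ_{j+1} and normalize. Applied to a continuous-time Ehrenfest model with birth rate N−k and death rate k (N = 6 is an arbitrary choice here), it reproduces the Bin(N, 1/2) result:

```python
from math import comb

def stationary_birth_death(birth, death, n_states):
    """Stationary distribution of a birth-death process with rates
    birth(k): k -> k+1 and death(k): k -> k-1, via the product formula
    rho_k = prod_{j<k} birth(j)/death(j+1), normalized to sum to 1."""
    rho = [1.0]
    for k in range(1, n_states):
        rho.append(rho[-1] * birth(k - 1) / death(k))
    total = sum(rho)
    return [r / total for r in rho]

# Ehrenfest urn model with N balls (state = number of balls in urn I):
# birth rate N-k, death rate k.
N = 6
pi = stationary_birth_death(lambda k: N - k, lambda k: k, N + 1)
binom = [comb(N, k) / 2**N for k in range(N + 1)]
assert all(abs(a - b) < 1e-12 for a, b in zip(pi, binom))
```

The same routine gives the geometric stationary distribution of the M/M/1 queue if one truncates the state space and uses birth rate λ and death rate μ with λ&lt;μ.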
The Poisson process. Definition as a Markov process and as a process whose increments have a Poisson distribution and are independent on disjoint intervals. I should also have mentioned that a sum of independent Poisson processes is a Poisson process. Solved problems 43, 45 and 48, where 48 was about independent p-thinning. Solved problem 52 as an illustration of birth processes. The concept of birth-and-death processes, and pointed out that there is an algorithm for finding the stationary distribution.
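The independent p-thinning result can be checked numerically: if N(t) is Poisson(λt) and each event is kept independently with probability p, the number of kept events is Poisson(pλt). A sketch with made-up parameters, conditioning on N(t) and summing out the binomial thinning:

```python
from math import exp, comb, factorial

def pois(k, m):
    """Poisson(m) probability of k events."""
    return exp(-m) * m**k / factorial(k)

lam, t, p = 3.0, 2.0, 0.4
m = lam * t   # mean number of events before thinning

def kept_prob(k, terms=60):
    """P(kept = k) = sum_n P(N=n) * C(n,k) p^k (1-p)^(n-k);
    the sum is truncated, which is harmless for m = 6."""
    return sum(pois(n, m) * comb(n, k) * p**k * (1 - p)**(n - k)
               for n in range(k, terms))

# The thinned counts match a Poisson(p*lam*t) distribution:
for k in range(6):
    assert abs(kept_prob(k) - pois(k, p * m)) < 1e-10
```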
Talked about time to absorption and solved the parts of the problem concerning this.
Showed how a reliability system as described in problem 5.1, with two components in parallel with exponentially distributed life lengths and exponentially distributed repair times (all assumed to be independent), can be modeled by a Markov process, and how easy it is to obtain the intensity matrix Q. In particular, we make arguments where we see what can happen in a short (infinitesimal) time interval h and keep track of probabilities of the order of magnitude h; these are the ones included in Q. Discussed how we sometimes, as in problem 5.1, can lump states together, but cautioned that you have to be sure that the Markov property is not lost. In problem 5.1 we could lump states so that we only keep track of how many components work, i.e. we need only 3 states. The concepts MTTF (Mean Time To Failure), MTBF (Mean Time Between Failures) and MDT (Mean Down Time) and the relation MTBF = MTTF + MDT. Note as a warning that different books define these concepts ambiguously. In addition, these concepts are not always well-defined. The concept of asymptotic availability A(∞) and the relations A(∞) = MTTF/MTBF = MTTF/(MTTF+MDT), or equivalently 1−A(∞) = MDT/MTBF = MDT/(MTTF+MDT). Showed how the stationary distribution can be interpreted as the fraction of time spent in different states and how cycle arguments can be used.
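For a single repairable component (a made-up numeric example; problem 5.1 itself has two components), the cycle-argument relation A(∞) = MTTF/(MTTF+MDT) can be checked against the stationary distribution of the 2-state Markov process:

```python
# Hypothetical rates: failure rate lam, repair rate mu.
lam, mu = 0.2, 1.5

# Cycle argument: MTTF = 1/lam, MDT = 1/mu.
mttf = 1 / lam
mdt = 1 / mu
a_cycle = mttf / (mttf + mdt)

# The same value from the stationary distribution of the 2-state process
# with Q = [[-lam, lam], [mu, -mu]]: pi = (mu, lam)/(lam + mu), and the
# asymptotic availability is the stationary probability of the up state.
a_stationary = mu / (lam + mu)

assert abs(a_cycle - a_stationary) < 1e-12
```

Both routes give A(∞) = μ/(λ+μ), here 15/17 ≈ 0.882.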
Introduction to continuous time. The Markov property and the Chapman-Kolmogorov equations. The concept of a regular process, i.e. a finite number of transitions in finite time and strictly positive visit times. The unconditional probabilities and how these can be obtained from the initial distribution and the transition matrix. Gave an argument which used the Markov property to show that the visit times must be exponentially distributed, since this is the only memoryless distribution. A major problem is that the transition matrix P(t) is difficult to specify directly, and we therefore introduced its derivative Q (the intensity matrix), which means that P(h) = I + hQ + o(h) for small h, where I is the identity matrix. Properties of the matrix Q, e.g. the fact that its row sums are 0. Showed how we can get the dynamics from Q: the diagonal elements give the parameters of the exponentially distributed visit times, and the process then jumps with probabilities proportional to the off-diagonal elements of Q. Showed the Kolmogorov forward and backward equations. Talked about stationary distributions and ergodicity. Solved problem 34, except the part about time to absorption.
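A sketch of how Q determines the dynamics: Euler-integrating the Kolmogorov forward equation P'(t) = P(t)Q for a two-state process (the rates are made up) uses exactly the relation P(h) ≈ I + hQ, and the result can be compared with the known closed form for P_01(t):

```python
from math import exp

# Two-state process (0 = up, 1 = down) with hypothetical rates.
lam, mu = 1.0, 2.0
Q = [[-lam, lam], [mu, -mu]]

def step(P, h):
    """One Euler step of the forward equation: P(t+h) ~ P(t)(I + hQ)."""
    n = len(P)
    return [[P[i][j] + h * sum(P[i][k] * Q[k][j] for k in range(n))
             for j in range(n)] for i in range(n)]

P = [[1.0, 0.0], [0.0, 1.0]]   # P(0) = I
h, t = 1e-4, 1.0
for _ in range(int(t / h)):
    P = step(P, h)

# Closed form for this 2-state chain:
# P_01(t) = lam/(lam+mu) * (1 - exp(-(lam+mu) t)).
exact_01 = lam / (lam + mu) * (1 - exp(-(lam + mu) * t))
assert abs(P[0][1] - exact_01) < 1e-3
```

Note that each Euler step preserves the row sums, mirroring the fact that Q has row sums 0 while P(t) has row sums 1.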
Treated some graph theoretical aspects, e.g. "i leads to j" and "i has (two-way) communication with j", and that the latter is an equivalence relation and therefore splits the state space E into disjoint irreducible subclasses (within which all states communicate). The concept of a closed subclass (no arrows point out of the class). Showed that a finite E always contains at least one closed irreducible subclass. The concept of ergodicity, i.e. that the asymptotic behaviour does not depend on the initial distribution. The concept of the period of a state, and that states which communicate have the same period. An aperiodic state is one with period 1. The main theorem: a finite chain with only one closed irreducible subclass which is aperiodic is ergodic. The limiting distribution must be the stationary distribution. The convergence is extremely fast. Solved problem 8 as an illustration of how to prove ergodicity.
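The extremely fast convergence can be illustrated by taking powers of a small transition matrix (the numbers are hypothetical; the chain has a single closed irreducible aperiodic class, so it is ergodic). All rows of P^n become equal, i.e. the limit does not depend on the initial state, and each row is the stationary distribution:

```python
def matmul(A, B):
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

# A small ergodic chain: irreducible, and aperiodic since the
# diagonal entries are positive.
P = [[0.5, 0.5, 0.0],
     [0.25, 0.5, 0.25],
     [0.0, 0.5, 0.5]]

Pn = P
for _ in range(50):
    Pn = matmul(Pn, P)

# All rows are numerically equal: the common row is the stationary
# distribution, here (0.25, 0.5, 0.25).
for row in Pn:
    assert all(abs(a - b) < 1e-10 for a, b in zip(row, Pn[0]))
```

For this chain the second-largest eigenvalue is 0.5, so the distance to the limit shrinks by about half per step, which is why 50 steps are far more than enough.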
Discussed cycles and stated that the stationary probability π_j can be interpreted as the expected time spent in state j during a cycle divided by the expected cycle length.
Introduction to Markov processes. Definition of the Markov property (the memoryless property). Showed the Chapman-Kolmogorov equations. Outlined the approximate contents of the part on Markov processes. Solved problem 1. Solved problem 16 as an illustration of how simple it is to calculate the transition matrix from a verbal description of the situation, and also of the treatment of absorbing chains.
The concept of a time invariant (stationary) distribution, i.e. a probability vector π such that πP = π.
Treated the Kaplan-Meier estimator (product limit estimator), which estimates the survival function for all distributions when we have Type IV censoring (each single observation can be censored independently). Solved problem 2.4 as an illustration. The methodology is often called survival analysis. Described the Nelson estimator (an alternative to the Kaplan-Meier estimator for estimating the survival function). Calculated the estimator for the data in 2.4 and described why the results are so similar to the Kaplan-Meier result. Described the methodology for estimating the parameters of the Weibull distribution under Type II censoring by looking at the ML equations. Described how Weibull graphical paper can be used to get numerical estimates of the parameters of the Weibull distribution. Next week we treat Markov processes.
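A minimal sketch of the product limit estimator (the censored data below are made up, not the data of problem 2.4). At each observed failure time t_i, the survival estimate is multiplied by (1 − d_i/n_i), where d_i is the number of failures at t_i and n_i the number at risk just before:

```python
# Right-censored data as (time, event): event = 1 for an observed
# failure, event = 0 for a censoring. Hypothetical numbers.
data = [(2, 1), (3, 0), (5, 1), (5, 1), (7, 0), (8, 1)]

def kaplan_meier(data):
    """Return [(t, S_hat(t))] at each failure time:
    S_hat(t) = prod over failure times t_i <= t of (1 - d_i/n_i)."""
    data = sorted(data)
    n = len(data)
    s, out = 1.0, []
    i = 0
    while i < n:
        t = data[i][0]
        d = sum(1 for (u, e) in data if u == t and e == 1)   # failures at t
        at_risk = sum(1 for (u, _) in data if u >= t)        # n_i just before t
        if d > 0:
            s *= 1 - d / at_risk
            out.append((t, s))
        i += sum(1 for (u, _) in data if u == t)   # skip ties at this time
    return out

km = kaplan_meier(data)
```

For these data the estimate steps down at the failure times only: the censored observations reduce the risk set but cause no step, which is exactly what distinguishes the estimator from the empirical survival function.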
Analysis of life length data. ML-estimate for complete exponentially distributed data and ML-estimate under Type II censoring (we stop the experiment when the r:th failure occurs) without replacement (we do not replace failed components). Showed that in both cases the ML-estimate of λ is r/T(x_(r)), where T(x_(r)) is the total time on test. Showed that 2λT(x_(r)) is χ²(2r)-distributed.
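A sketch of the Type II censoring computation (n, r and the failure times are hypothetical). The total time on test counts the full life of each observed failure plus the time the n − r surviving units have been running at the stopping time x_(r):

```python
# Hypothetical experiment: n units on test, stop at the r-th failure,
# no replacement of failed units.
n, r = 10, 4
failures = [12.0, 25.0, 31.0, 44.0]   # ordered failure times x_(1..r)

# Total time on test: T(x_(r)) = sum of observed failure times
# plus (n - r) * x_(r) for the survivors.
T = sum(failures) + (n - r) * failures[-1]

# ML-estimate of the exponential failure rate.
lam_hat = r / T

# A confidence interval for lambda would be based on the fact that
# 2*lambda*T(x_(r)) is chi2-distributed with 2r degrees of freedom.
```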
Introduced the concept of the empirical distribution function and showed the connection between the TTT-plot and the empirical distribution.
The concept of the TTT-transform H_F^{-1}(v) = ∫_0^{F^{-1}(v)} (1−F(u)) du. Treated the material on optimal maintenance using the TTT-plot. Solved problem 2.6 as an application of this. It is also part of the first computer hand-in, and these were distributed. Analysis of complete life length data using the ML-estimate for exponentially distributed data.
About analysis of life length data. Described how we can have complete data, data up to a fixed time (Type I censoring), data until a fixed number have failed (Type II censoring), or data until the first of these occurs (Type III censoring). The concept "total time on test" (TTT). Described the TTT-plot, solved problem 2.2, and gave some hints on how Matlab can be used for the first computer hand-in by solving problem 2.1 in Matlab. Mentioned that the TTT-plot can be used to see whether the distribution is IFR, DFR or exponential.
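The TTT-plot computation can be sketched like this (the data are made up). With ordered observations x_(1) ≤ … ≤ x_(n), the total time on test after i failures is T_i = Σ_{j≤i} (n−j+1)(x_(j) − x_(j−1)), and the plot shows the points (i/n, T_i/T_n); a plot lying mostly above the diagonal suggests IFR, below suggests DFR, and roughly on the diagonal suggests exponential:

```python
# Hypothetical complete sample of life lengths.
x = sorted([3.0, 1.0, 7.0, 2.0, 11.0])
n = len(x)

# Total time on test after each failure.
T = []
prev, total = 0.0, 0.0
for j, xj in enumerate(x, start=1):
    total += (n - j + 1) * (xj - prev)   # all surviving units run this long
    prev = xj
    T.append(total)

# Scaled TTT-plot points (i/n, T_i/T_n).
points = [(i / n, Ti / T[-1]) for i, Ti in enumerate(T, start=1)]
```

A useful sanity check: for a complete sample, T_n equals the sum of all observations.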
Solved problem 1.8. The concept NBU (New Better than Used), i.e. that P(T>x+y) ≤ P(T>x)P(T>y), with the interpretation as the conditional probability statement P(T>x+y | T>x) ≤ P(T>y). Showed that IFRA implies NBU.
Solved 1.6 and 1.7 about the minimum of independent Weibull-distributed variables. The Poisson process as a model for "events occurring completely at random in time". Showed that the times between events in a Poisson process are exponentially distributed. Introduced the Γ(n,λ) distribution as the distribution of the sum of n independent Exp(λ)-distributed variables, and its density. Showed some properties of the Poisson process and the exponential distribution, e.g. that given that an event has occurred in an interval, its position is uniformly distributed over the interval.
Solved 1.12 as an illustration of how decreasing failure rate can occur when we have a mixture of good and bad components.
Solved problem 1.3 as an illustration of the formulas for E(T) and E(T²). Introduced the classes IFRA (the failure rate increases on average) and DFRA (the failure rate decreases on average). Showed that IFR implies IFRA (and in the same manner DFR implies DFRA). The reason we want to study an extended class is that systems of independent IFR components are not necessarily IFR. Gave an example of a system of independent IFR components which is not IFR. (A result from a later part of the course says that systems of independent IFRA components are IFRA.) Showed the memoryless property of the exponential distribution and its interpretation. Furthermore we have P(T>x+y) = P(T>x)P(T>y), or equivalently P(T>x+y | T>x) = P(T>y), and this property is unique to the exponential distribution.
Important concepts from the basic course such as probability distribution, density function, (cumulative) distribution function and their relations. Defined the survival function and the failure rate and their relation. The bathtub curve. Solved problem 1.1. Solved most of 1.2, i.e. the failure rate for the Weibull distribution and that it has increasing, constant or decreasing failure rate depending on the value of the form parameter c. Defined the classes IFR (Increasing Failure Rate) and DFR (Decreasing Failure Rate), and noted that the Weibull distribution is DFR if the form parameter satisfies 0 < c ≤ 1, IFR when c ≥ 1, and both IFR and DFR when c = 1 (the exponential distribution).
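The statement about the form parameter can be checked numerically. Assuming one common parametrization of the Weibull survival function, R(t) = exp(−(t/a)^c), the failure rate is z(t) = (c/a)(t/a)^{c−1}, which is increasing for c > 1, decreasing for c < 1 and constant for c = 1:

```python
def weibull_failure_rate(t, a, c):
    """Failure rate z(t) = (c/a)*(t/a)^(c-1) for the parametrization
    R(t) = exp(-(t/a)^c) (a = scale, c = form parameter)."""
    return (c / a) * (t / a) ** (c - 1)

ts = [0.5, 1.0, 2.0, 4.0]
incr = [weibull_failure_rate(t, 1.0, 2.0) for t in ts]   # c > 1: IFR
decr = [weibull_failure_rate(t, 1.0, 0.5) for t in ts]   # c < 1: DFR
const = [weibull_failure_rate(t, 1.0, 1.0) for t in ts]  # c = 1: exponential

assert all(a < b for a, b in zip(incr, incr[1:]))
assert all(a > b for a, b in zip(decr, decr[1:]))
assert all(abs(v - const[0]) < 1e-12 for v in const)
```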
Showed the formula E(T) = ∫_0^∞ R(t) dt, where R is the survival function.

Gunnar Englund. Last change: 2010-09-08