Long-Range Dependence of Markov Processes
Long-range dependence in discrete- and continuous-time Markov chains over a countable state space is defined via embedded renewal processes generated by visits to a fixed state. For the discrete-time chain, solidarity properties are obtained and long-range dependence of functionals is examined. For continuous-time chains, long-range dependence is instead defined via the number of visits in a given time interval. Long-range dependence of Markov chains over a non-countable state space is...
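For context, the standard notion the abstract invokes can be sketched as follows (this is the usual textbook definition of long-range dependence, not a formula taken from the thesis itself): a stationary process is long-range dependent when its autocovariances decay so slowly that they are not summable.

```latex
% Standard definition of long-range dependence (assumed background, not from the thesis):
% a stationary process (X_k) with autocovariance \gamma is LRD when
\gamma(k) = \operatorname{Cov}(X_0, X_k),
\qquad
\sum_{k=0}^{\infty} \lvert \gamma(k) \rvert = \infty .
```

In the renewal-based setting described above, this behaviour typically arises when the return time to the fixed state is heavy-tailed with infinite variance, so that the counting process of visits inherits the slow covariance decay.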
Collections: Open Access Theses