Search results
Searched for: in keywords: censoring
Results found: 3

1
In this work, we introduce a method of estimating stochastic freeway capacity using elements of both extreme value theory and survival analysis. First, we define capacity data, or estimates of the capacity of the roadway, as the daily maximum flow values. Then, under a survival analysis premise, we introduce censoring into our definition. That is, on days when flows are sufficiently high and congestion occurs, the corresponding flow maxima are considered true estimates of capacity; on days without high flows or congestion, the flow maxima are deemed censored observations, and the true capacities must be higher than the observed values. By extreme value theory, the collection of flow maxima (block maxima) can be appropriately approximated with a generalized extreme value (GEV) distribution. Because of small sample sizes and the presence of censoring, a Bayesian framework is pursued for model fitting and parameter estimation. To lend credence to our proposed methodology, the procedure is applied to real-world traffic stream data collected by the New Hampshire Department of Transportation (NHDOT) at a busy location on Interstate 93 near Salem, New Hampshire. Data were collected over a period of 11 months, and the raw data were aggregated into 15-minute intervals. To assess our procedure, and to provide proof of concept, several validation procedures are presented. First, using distinct training and validation subsets of our data, the procedure yields accurate predictions of highway capacity. Next, our procedure is applied to a training set to yield random capacities, which are then used to predict breakdown in the validation set. The frequency of these predicted breakdowns is found to be statistically similar to the frequency of breakdowns observed in our validation set. Lastly, after comparing our methodology to other methods of stochastic capacity estimation, we find our procedure to be highly successful.
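A minimal sketch of the censored-GEV idea described above, not the authors' implementation: daily maxima from congested days contribute the GEV density to the likelihood, while maxima from uncongested days are treated as right-censored and contribute the survival function, and the posterior is explored with a simple sampler. The congestion flag, flat priors, sampler settings, and simulated flow values are all illustrative assumptions.

```python
import numpy as np
from scipy.stats import genextreme

def log_likelihood(params, maxima, congested):
    """Censored GEV log-likelihood: density for congested-day maxima,
    survival function for right-censored (uncongested-day) maxima."""
    mu, log_sigma, xi = params
    sigma = np.exp(log_sigma)          # keep the scale positive
    c = -xi                            # scipy's genextreme uses c = -xi
    obs = maxima[congested]
    cens = maxima[~congested]
    ll = genextreme.logpdf(obs, c, loc=mu, scale=sigma).sum()
    ll += genextreme.logsf(cens, c, loc=mu, scale=sigma).sum()
    return ll if np.isfinite(ll) else -np.inf

def metropolis(maxima, congested, start, n_iter=20000):
    """Random-walk Metropolis under flat priors on (mu, log sigma, xi)."""
    rng = np.random.default_rng(0)
    step = np.array([5.0, 0.05, 0.05])        # proposal step sizes (assumed)
    chain = np.empty((n_iter, 3))
    current = np.asarray(start, dtype=float)
    ll_cur = log_likelihood(current, maxima, congested)
    for i in range(n_iter):
        proposal = current + step * rng.standard_normal(3)
        ll_prop = log_likelihood(proposal, maxima, congested)
        if np.log(rng.random()) < ll_prop - ll_cur:
            current, ll_cur = proposal, ll_prop
        chain[i] = current
    return chain

# Toy usage with simulated 15-minute flow maxima (veh/h); real input would be NHDOT counts.
rng = np.random.default_rng(1)
maxima = rng.normal(1900.0, 120.0, size=300)
congested = maxima > 1950.0                   # placeholder congestion indicator
chain = metropolis(maxima, congested, start=(1900.0, np.log(100.0), 0.0))
print(chain[len(chain) // 2:].mean(axis=0))   # posterior means of (mu, log sigma, xi)
```

Posterior draws of (mu, sigma, xi) obtained this way can then be turned into random capacity values by sampling from the fitted GEV, which is how the abstract describes predicting breakdowns in the validation set.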
2
On the informative value of the largest sample element of log-Gumbel distribution
Extremes of stream flow and precipitation are commonly modeled by heavy-tailed distributions. When annual flow maxima or peaks over threshold are scrutinized, the largest sample elements are quite often suspected of being low-quality data, outliers, or values corresponding to much longer return periods than the observation period. Since the interest is primarily in the estimation of the right tail (in the case of floods or heavy rainfalls), the sensitivity of upper quantiles to the largest elements of a series is a problem of special concern. This study investigated the sensitivity problem using the log-Gumbel distribution, generating samples of different sizes (n) and different values of the coefficient of variation in Monte Carlo experiments. Parameters of the log-Gumbel distribution were estimated by the method of probability weighted moments (PWM), the method of moments (MOM), and the maximum likelihood method (MLM), both for complete samples and for samples deprived of their largest elements. In the latter case, the distribution censored at the non-exceedance probability threshold FT was considered. Using FT instead of the censoring threshold T makes it possible to control the properties of the estimators. The effect of the FT value on the performance of the quantile estimates was then examined. It is shown that right censoring of the data need not reduce the accuracy of large-quantile estimates if the PWM or MOM method is employed. Moreover, by allowing some bias in the estimates, one can gain in variance and in mean square error of large quantiles even if the ML method is used.
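A small illustration of one ingredient of the study above, under assumed parameter values and sample size: PWM estimation of the log-Gumbel parameters (fitting a Gumbel distribution to the logarithms) and the shift in an upper quantile when the largest sample element is removed. The censored-threshold (FT) estimators themselves are not reproduced here; the function names and constants are illustrative.

```python
import numpy as np

EULER_GAMMA = 0.5772156649

def pwm_gumbel(y):
    """Probability-weighted-moment estimates (mu, sigma) for a Gumbel sample y."""
    y = np.sort(y)
    n = len(y)
    b0 = y.mean()
    b1 = np.sum(np.arange(n) / (n - 1) * y) / n      # unbiased estimate of beta_1
    sigma = (2.0 * b1 - b0) / np.log(2.0)            # since 2*beta_1 - beta_0 = sigma*ln 2
    mu = b0 - EULER_GAMMA * sigma
    return mu, sigma

def log_gumbel_quantile(p, mu, sigma):
    """Quantile of X when ln(X) ~ Gumbel(mu, sigma)."""
    return np.exp(mu - sigma * np.log(-np.log(p)))

# Simulated log-Gumbel sample (illustrative parameters, n = 50).
rng = np.random.default_rng(0)
mu_true, sigma_true = 5.0, 0.4
x = np.exp(mu_true - sigma_true * np.log(-np.log(rng.uniform(size=50))))

for label, sample in [("full sample", x), ("largest element removed", np.sort(x)[:-1])]:
    mu_hat, sigma_hat = pwm_gumbel(np.log(sample))
    print(label, "Q(0.99) =", round(log_gumbel_quantile(0.99, mu_hat, sigma_hat), 1))
```

Comparing the two printed 0.99 quantiles gives a rough sense of how strongly a single extreme observation can move the upper tail, which is the sensitivity question the paper studies systematically.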
3
Transformed diffeomorphic kernel estimation of hazard rate function
In the article, a transformed diffeomorphic kernel estimator of the hazard rate function in the presence of censoring is constructed. The estimator is defined in the framework of the multiplicative intensity point process model. It is shown that the proposed estimator is asymptotically unbiased, consistent, and asymptotically normal. An analysis of the bias reduction achieved by the diffeomorphic estimator is carried out. Some simulation results comparing the obtained estimator with the Ramlau-Hansen estimator are also presented.
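For context, a minimal sketch of the Ramlau-Hansen kernel hazard estimator that serves as the benchmark in the abstract above: the Nelson-Aalen increments dN/Y computed from right-censored data are smoothed with a kernel of bandwidth b. The diffeomorphic transformation that defines the proposed estimator is not reproduced; the data, kernel choice, and bandwidth are illustrative assumptions.

```python
import numpy as np

def ramlau_hansen(times, events, grid, bandwidth):
    """Kernel-smoothed hazard rate from right-censored data
    (times = observed durations, events = 1 if failure, 0 if censored)."""
    order = np.argsort(times)
    times, events = times[order], events[order]
    n = len(times)
    at_risk = n - np.arange(n)                       # Y(t_i): subjects still at risk
    increments = events / at_risk                    # Nelson-Aalen jumps dN/Y
    u = (grid[:, None] - times[None, :]) / bandwidth
    kern = 0.75 * (1 - u**2) * (np.abs(u) <= 1)      # Epanechnikov kernel
    return (kern * increments).sum(axis=1) / bandwidth

# Toy usage: exponential lifetimes (true hazard 0.5) with independent censoring.
rng = np.random.default_rng(0)
lifetimes = rng.exponential(2.0, size=400)
censor = rng.exponential(4.0, size=400)
times = np.minimum(lifetimes, censor)
events = (lifetimes <= censor).astype(float)
grid = np.linspace(0.5, 3.0, 6)
print(ramlau_hansen(times, events, grid, bandwidth=0.5))  # should hover near 0.5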