Similar Articles
 20 similar articles found (search time: 31 ms)
1.
The analysis of clustered binary data is a common task in many areas of application. Parametric approaches to the analysis of such data are numerous, but there has been much recent interest in nonparametric and semiparametric approaches. When cluster sizes are unequal, an assumption is often made of compatibility of marginal distributions in order for semiparametric approaches to be developed when there is little replication for different cluster sizes. Here, we use the marginal compatibility assumption to extend flexible semiparametric Bayesian methods able to shrink towards a “parametric backbone” to the situation where there are few replicated observations for distinct cluster sizes and each distinct value of a covariate. A motivating application is the analysis of developmental toxicology data where pregnant laboratory animals are exposed to a dose of some potentially toxic compound and interest lies in describing the distribution, as a function of the dose level, of the number of fetuses exhibiting some characteristic abnormality. Flexible semiparametric methods are required here, as the data typically exhibit overdispersion and complex structure. We also consider a further extension appropriate to the analysis of clustered binary data in the situation where there is little or no replication for distinct covariate values.
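A common "parametric backbone" for clustered binary (litter) data of this kind is the beta-binomial; the minimal sketch below shows what fitting such a backbone by maximum likelihood could look like at a single dose level. It is not the authors' semiparametric estimator, the litter counts are hypothetical, and NumPy/SciPy are assumed to be available.

```python
import numpy as np
from scipy.special import gammaln, betaln
from scipy.optimize import minimize

def betabinom_loglik(params, y, n):
    """Log-likelihood of cluster counts y (affected fetuses) out of n (litter sizes)
    under a beta-binomial with mean mu and intra-litter correlation rho."""
    mu, rho = params
    a = mu * (1 - rho) / rho
    b = (1 - mu) * (1 - rho) / rho
    ll = (gammaln(n + 1) - gammaln(y + 1) - gammaln(n - y + 1)
          + betaln(y + a, n - y + b) - betaln(a, b))
    return ll.sum()

# toy litter data at one dose level (hypothetical numbers)
y = np.array([0, 1, 3, 2, 0, 5])
n = np.array([10, 12, 11, 9, 13, 12])
fit = minimize(lambda p: -betabinom_loglik(p, y, n),
               x0=[0.2, 0.2], bounds=[(1e-4, 1 - 1e-4)] * 2)
print(fit.x)  # estimated (mu, rho)
```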

2.
Many biological phenomena undergo developmental changes in time and space. Functional mapping, which is aimed at mapping genes that affect developmental patterns, is instrumental for studying the genetic architecture of biological changes. Often biological processes are mediated by a network of developmental and physiological components and, therefore, are better described by multiple phenotypes. In this article, we develop a multivariate model for functional mapping that can detect and characterize quantitative trait loci (QTLs) that simultaneously control multiple dynamic traits. Because the true genotypes of QTLs are unknown, the measurements for the multiple dynamic traits are modeled using a mixture distribution. The functional means of the multiple dynamic traits are estimated using the nonparametric regression method, which avoids any parametric assumption on the functional means. We propose the profile likelihood method to estimate the mixture model. A likelihood ratio test is exploited to test for the existence of pleiotropic effects on distinct but developmentally correlated traits. A simulation study is implemented to illustrate the finite sample performance of our proposed method. We also demonstrate our method by identifying QTLs that simultaneously control three dynamic traits of soybeans. The three dynamic traits are the time-course biomass of the leaf, the stem, and the root of the whole soybean. The genetic linkage map is constructed with 950 microsatellite markers. The new model can aid in our comprehension of the genetic control mechanisms of complex dynamic traits over time.  相似文献   

3.
Covariance structure modeling plays a key role in spatial data analysis. Various parametric models have been developed to accommodate the idiosyncratic features of a given dataset. However, parametric models may impose unjustified restrictions on the covariance structure, and the procedure of choosing a specific model is often ad hoc. To avoid the choice of parametric forms, we propose a nonparametric covariance estimator for spatial data, as well as its extension to spatio-temporal data based on the class of space-time covariance models developed by Gneiting (J. Am. Stat. Assoc. 97:590–600, 2002). Our estimator is obtained via a nonparametric approximation of completely monotone functions. It is easy to implement, and our simulations show that it outperforms parametric models when there is no clear information on model specification. Two real datasets are analyzed to illustrate our approach and to provide a further comparison between the nonparametric estimator and parametric models.
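By Bernstein's theorem, a completely monotone covariance can be written as a nonnegative mixture of exponentials, so one simple way to approximate it nonparametrically is nonnegative least squares on a fixed grid of decay rates. The sketch below illustrates that idea only; it is not the estimator of the paper, and the lag/covariance values and the rate grid are hypothetical.

```python
import numpy as np
from scipy.optimize import nnls

def fit_monotone_covariance(lags, emp_cov, rates=None):
    """Approximate a completely monotone covariance C(h) ~ sum_j w_j * exp(-rate_j * h),
    w_j >= 0, by nonnegative least squares on an empirical covariance estimate."""
    if rates is None:
        rates = np.logspace(-2, 1, 20)          # hypothetical grid of decay rates
    X = np.exp(-np.outer(lags, rates))          # basis of exponentials (Bernstein mixture)
    w, _ = nnls(X, emp_cov)                     # nonnegative weights keep C completely monotone
    return rates, w

def cov_fn(h, rates, w):
    return np.exp(-np.outer(np.atleast_1d(h), rates)) @ w

# toy usage: empirical covariances at a few spatial lags (hypothetical)
lags = np.array([0.0, 0.5, 1.0, 2.0, 4.0])
emp_cov = np.array([1.00, 0.62, 0.40, 0.18, 0.05])
rates, w = fit_monotone_covariance(lags, emp_cov)
print(cov_fn(1.5, rates, w))
```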

4.
Measurements of both continuous and categorical outcomes appear in many statistical problems. One such example is the study of teratology and developmental toxicity, where both the probability that a live fetus is malformed (an ordinal outcome) and its birth weight (a continuous outcome) are important measures in the context of teratogenicity. Although multivariate methods for the analysis of continuous outcomes are well understood, methods for jointly analyzing continuous and discrete outcomes are less familiar. We propose a likelihood-based method that is an extension of the Plackett-Dale approach. Specification of the full likelihood is avoided using pseudo-likelihood methodology. The estimation of safe dose levels as part of quantitative risk assessment is illustrated with data from a developmental toxicity experiment of diethylene glycol dimethyl ether in mice.

5.
When toxicity data are not available for a chemical mixture of concern, U.S. Environmental Protection Agency (EPA) guidelines allow risk assessment to be based on data for a surrogate mixture considered “sufficiently similar” in terms of chemical composition and component proportions. As a supplementary approach, using statistical equivalence testing logic and mixed model theory we have developed methodology to define sufficient similarity in dose-response for mixtures of many chemicals containing the same components with different ratios. Dose-response data from a mixture of 11 xenoestrogens and the endogenous hormone, 17β-estradiol, are used to illustrate the method.

6.
For mixtures of many chemicals, a ray design based on a relevant, fixed mixing ratio is useful for detecting departure from additivity. Methods for detecting departure involve modeling the response as a function of total dose along the ray. For mixtures with many components, the interaction may be dose dependent. Therefore, we have developed the use of a three-segment model containing both a dose threshold and an interaction threshold. Prior to the dose threshold, the response is that of background; between the dose threshold and the interaction threshold, an additive relationship exists; the model allows for departure from additivity beyond the interaction threshold. With such a model, we can conduct a hypothesis test of additivity, as well as a test for a region of additivity. The methods are illustrated with cytotoxicity data that arise when Chinese hamster ovary cells are exposed to a mixture of nine haloacetic acids.
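As an illustration of the idea (not the authors' exact formulation), the three-segment structure can be written as a linear spline in total dose with a background level, a dose threshold, and an interaction threshold; a test of additivity then amounts to testing whether the slope beyond the interaction threshold is zero. The sketch below uses a simplified linear mean, hypothetical data, and assumes NumPy/SciPy.

```python
import numpy as np
from scipy.optimize import curve_fit

def three_segment(d, b0, b1, b2, delta, gamma):
    """Piecewise mean response along a fixed-ratio mixture ray:
    background b0 below the dose threshold delta, an additive slope b1 between
    delta and the interaction threshold gamma, and an extra slope b2 beyond gamma."""
    d = np.asarray(d, dtype=float)
    add = b1 * np.clip(d - delta, 0.0, None)          # additive segment
    dep = b2 * np.clip(d - gamma, 0.0, None)          # departure from additivity
    return b0 + add + dep

# toy cytotoxicity-style data along total dose (hypothetical values)
rng = np.random.default_rng(0)
dose = np.linspace(0, 10, 30)
resp = three_segment(dose, 1.0, 0.3, 0.8, 2.0, 6.0) + rng.normal(0, 0.1, dose.size)
p0 = [1.0, 0.2, 0.5, 1.5, 5.0]
est, cov = curve_fit(three_segment, dose, resp, p0=p0)
print(est)   # b0, b1, b2, delta, gamma; additivity corresponds to b2 = 0
```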

7.
An important environmental and regulatory issue is the protection of human health from potential adverse effects of cumulative exposure to multiple chemicals. Earlier literature suggested restricting inference to specific fixed-ratio rays of interest. Based on appropriate definitions of additivity, single chemical data are used to predict the relationship among the chemicals under the zero-interaction case. Parametric comparisons between the additivity model and the model fit along the fixed-ratio ray(s) are used to detect departure from additivity. Collection of data along reduced fixed-ratio rays, where subsets of chemicals of interest are removed from the mixture and the remaining compounds are at the same relative ratios as considered in the full ray, allows researchers to make inference about the effect of the removed chemicals. Methods for fitting simultaneous confidence bands about the difference between the best-fitting model and the model predicted under additivity are developed to identify regions along the rays where significant interactions occur. This general approach is termed the “single chemicals required” (SCR) method of analysis. A second approach, termed the “single chemicals not required” (SCNR) method of analysis, is based on underlying assumptions about the parameterization of the response surface. Under general assumptions, polynomial terms for models fit along fixed-ratio rays are associated with interaction terms. Consideration is given to the case where only data along the mixture rays are available. Tests of hypotheses, which consider interactions due to subsets of chemicals, are also developed.

8.
When an interaction has been detected among the chemicals in a mixture, it may be of interest to predict the interaction threshold. A method is presented for estimating an interaction threshold along a mixture ray that allows for differences in the shapes of the dose-response curves of the individual components (e.g., mixtures of full and partial agonists with differing response maxima). A point estimate and confidence interval for the interaction threshold may be obtained. The methods are illustrated with data from a study of a mixture of 18 polyhalogenated aromatic hydrocarbons (PHAHs) in rats exposed by oral gavage for four consecutive days. Serum total thyroxine (T4) was the response variable. Previous analysis of these data demonstrated a dose-dependent interaction among the 18 chemicals in the mixture, with additivity suggested in the lower portion of the dose-response curve and synergy (a greater-than-additive response) in the higher portion of the dose-response curve. The present work builds on this analysis by constructing an interaction threshold model along the mixture ray. This interaction threshold model has two components: an implicit additivity region and an explicit region that describes the departure from additivity; the interaction threshold is the boundary between the two regions. Estimation of the interaction threshold within the observed experimental region suggested evidence of additivity in the low dose region. Total doses of the mixture that exceeded the upper limit of the confidence interval on the interaction threshold were associated with a greater-than-additive interaction.

9.
Pectenotoxins (PTXs) accumulate in shellfish feeding on dinoflagellates of the genus Dinophysis, so that humans can be exposed to these toxins through shellfish consumption. Some PTXs are toxic to experimental animals, whereas others are of much lower toxicity. Pectenotoxin-2, the most abundant PTX from most Dinophysis spp., is rapidly metabolized by most shellfish to a mixture of pectenotoxin-2 seco acid (2) and 7-epi-pectenotoxin-2 seco acid (1). A mixture of 1 and 2 was produced during purification of an extract from in vitro enzymatic hydrolysis of pectenotoxin-2. These were separated by preparative HPLC, and the structure of 1 was confirmed by one- and two-dimensional ¹H and ¹³C NMR spectroscopy and LC-MS³ analyses. No toxic changes were recorded in mice injected intraperitoneally with 1 or 2 at a dose of 5000 μg/kg. PTX seco acids are therefore unlikely to be of consequence to human consumers at the concentrations found in contaminated shellfish.

10.
Background, Aims, and Scope: There is an increasing demand for controlled toxicity tests to predict biological effects related to sediment metal contamination. In this context, questions of metal-specific factors, sensitivity of toxicity endpoints, and variability in exposure duration arise. In addition, the choice of the dose metric for responses is equally important and is related to the applicability of the concept of critical body residue (CBR) in exposure assessments, as well as being the main focus of this study. Methods: Experiments were conducted to assess the toxicity of Cd, Cr, Cu and Pb to the oligochaete worm Lumbriculus variegatus with the aim of determining CBRs for two response metrics. Mortality and feeding activity of worms exposed to sediment-spiked metals were used as endpoints in connection with residue analyses from both the organisms and the surrounding media. Results: LC50 values were 0.3, 1.4, 5.2, and 6.7 mg/L (from 4.7 μmol/L to 128.0 μmol/L), and the order of toxicity, from most toxic to least toxic, was Cu > Cd > Pb > Cr. By relating toxicity to body residue, variability in toxicity among the metals decreased and the order of toxicity was altered. The highest lethal residue value was obtained for Cu (10.8 mmol/kg) and the lowest was obtained for Cd (2.3 mmol/kg). In the 10-d sublethal test, both time and metal exposure were important sources of variation in the feeding activity of worms. Significant treatment effects were observed for worms exposed to Cd or Pb, with the controls yielding the highest feeding rate. However, quantitative changes in the measured endpoint did not correlate with the exposure concentrations or body residues, which remained an order of magnitude lower than in the acute exposures. Discussion: Both response metrics were able to detect a toxic effect of the metals. However, the ranking of metal toxicity was dependent on the choice of the dose metric used. An attempt to form a causal mortality-mediated link between tissue residues and metal toxicity was successful in water-only exposures. The results also indicated that egestion rate was a sensitive toxicity endpoint for predicting the effects of sediment contamination. Conclusions: By relating the biological response to the tissue metal residues, toxicity data were comparable across environmental media as well as across different response metrics and time scales. The results also revealed the importance of ranking metal toxicity on a molar basis and, furthermore, a direct link to the CBR concept was established. Recommendations and Perspectives: There is a growing demand for methods to assess the effects of contaminated sediments on benthic fauna and whole aquatic ecosystems. Such information is needed for sediment quality guidelines that are currently being developed in many countries and for remediation processes. The use of body residues as a dose metric in metal toxicity studies may help to overcome difficulties related to bioavailability issues commonly faced in sediment toxicity studies.
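The quoted micromolar range can be reproduced from the mg/L values with standard atomic masses; the pairing of LC50 values with metals below is an assumption inferred from the stated toxicity order (most toxic = lowest LC50), not a table from the paper.

```python
# Convert the reported LC50 values from mg/L to umol/L to compare metals on a molar basis.
atomic_mass = {"Cu": 63.55, "Cd": 112.41, "Pb": 207.2, "Cr": 52.00}   # g/mol
lc50_mg_per_L = {"Cu": 0.3, "Cd": 1.4, "Pb": 5.2, "Cr": 6.7}          # pairing assumed from the toxicity order

for metal, mg in lc50_mg_per_L.items():
    umol = mg / atomic_mass[metal] * 1000.0     # (mg/L) / (g/mol) * 1000 = umol/L
    print(f"{metal}: {mg} mg/L ~= {umol:.1f} umol/L")
# roughly Cu 4.7, Cd 12.5, Pb 25.1, Cr 128.8 umol/L, spanning the 4.7-128 umol/L range quoted above
```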

11.
We develop a method for multiscale estimation of pollutant concentrations, based on a nonparametric spatial statistical model. We apply this method to estimate nitrate concentrations in groundwater over the mid-Atlantic states, using measurements gathered during a period of 10 years. A map of the fine-scale estimated nitrate concentration is obtained, as well as maps of the estimated county-level average nitrate concentration and similar maps at the level of watersheds and other geographic regions. The fine-scale and coarse-scale estimates arise naturally from a single model, without refitting or ad hoc aggregation. As a result, the uncertainty associated with each estimate is available, without approximations relying on high spatial density of measurements or parametric distributional assumptions.

12.
The analysis of telemetry data is common in animal ecological studies. While the collection of telemetry data for individual animals has improved dramatically, the methods to properly account for inherent uncertainties (e.g., measurement error, dependence, barriers to movement) have lagged behind. Still, many new statistical approaches have been developed to infer unknown quantities affecting animal movement or predict movement based on telemetry data. Hierarchical statistical models are useful to account for some of the aforementioned uncertainties, as well as provide population-level inference, but they often come with an increased computational burden. For certain types of statistical models, it is straightforward to provide inference if the latent true animal trajectory is known, but challenging otherwise. In these cases, approaches related to multiple imputation have been employed to account for the uncertainty associated with our knowledge of the latent trajectory. Despite the increasing use of imputation approaches for modeling animal movement, the general sensitivity and accuracy of these methods have not been explored in detail. We provide an introduction to animal movement modeling and describe how imputation approaches may be helpful for certain types of models. We also assess the performance of imputation approaches in two simulation studies. Our simulation studies suggest that inference for model parameters directly related to the location of an individual may be more accurate than inference for parameters associated with higher-order processes such as velocity or acceleration. Finally, we apply these methods to analyze a telemetry data set involving northern fur seals (Callorhinus ursinus) in the Bering Sea. Supplementary materials accompanying this paper appear online.

13.
Evolution of toxicity upon hydrolysis of fenoxaprop-p-ethyl (total citations: 2; self-citations: 0; citations by others: 2)
Hydrolysis of fenoxaprop-p-ethyl (FE), a widely used herbicide, was studied in aqueous buffer solutions at pH ranging from 4.0 to 10.0. The degradation, strongly dependent on pH, followed first-order kinetics. FE was relatively stable in neutral media, whereas it degraded rapidly with decreasing or increasing pH. Under acidic conditions (pH = 4, 5), the benzoxazolyl-oxy-phenyl ether linkage of FE was cleaved to form ethyl 2-(4-hydroxyphenoxy)propanoate (EHPP) and 6-chloro-2,3-dihydrobenzoxazol-2-one (CDHB). Under basic conditions (pH = 8, 9, 10), the herbicidally active fenoxaprop-p (FA) was formed via breakdown of the ester bond of the herbicide. Both pathways occurred concurrently under neutral conditions (pH = 6, 7). Toxicity studies on Daphnia magna showed that FE was the most toxic to D. magna, with a 48 h EC50 of 14.3 μmol/L, followed by FA (43.8 μmol/L), CDHB (49.8 μmol/L), and EHPP (333.1 μmol/L). Mode-of-toxic-action analysis indicated that EHPP exhibited toxicity via polar narcosis, whereas CDHB acted as a reactive compound. The mixture toxicity of CDHB and EHPP was nonadditive and could be predicted by a response addition model. Therefore, evaluation of the overall toxicity of FE to D. magna in aquatic systems needs to consider the degradation of FE.
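Two pieces of this analysis lend themselves to short formulas: first-order decay of the parent compound and response (independent) addition for the CDHB/EHPP mixture. The sketch below uses hypothetical rate constants and effect probabilities, not values estimated in the study.

```python
import numpy as np

def first_order(C0, k, t):
    """First-order decay: C(t) = C0 * exp(-k * t); k is the pH-dependent rate constant."""
    return C0 * np.exp(-k * t)

def response_addition(p_components):
    """Response (independent) addition: predicted mixture effect probability
    P_mix = 1 - prod_i (1 - P_i) for components acting by dissimilar modes of action."""
    p = np.asarray(p_components, dtype=float)
    return 1.0 - np.prod(1.0 - p)

# toy illustration (hypothetical rate constant and effect probabilities)
print(first_order(10.0, 0.15, 24.0))          # concentration after 24 h with k = 0.15 / h
print(response_addition([0.30, 0.20]))        # 0.44 predicted joint response
```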

14.
Modeling complex collective animal movement presents distinct challenges. In particular, modeling the interactions between animals and the nonlinear behaviors associated with these interactions, while accounting for uncertainty in data, model, and parameters, requires a flexible modeling framework. To address these challenges, we propose a general hierarchical framework for modeling collective movement behavior with multiple stages. Each of these stages can be thought of as processes that are flexible enough to model a variety of complex behaviors. For example, self-propelled particle (SPP) models (e.g., Vicsek et al. in Phys Rev Lett 75:1226–1229, 1995) represent collective behavior and are often applied in the physics and biology literature. To date, the study and application of these models has almost exclusively focused on simulation studies, with less attention given to rigorously quantifying the uncertainty. Here, we demonstrate our general framework with a hierarchical version of the SPP model applied to collective animal movement. This structure allows us to make inference on potential covariates (e.g., habitat) that describe the behavior of agents and rigorously quantify uncertainty. Further, this framework allows for the discrete time prediction of animal locations in the presence of missing observations. Due to the computational challenges associated with the proposed model, we develop an approximate Bayesian computation algorithm for estimation. We illustrate the hierarchical SPP methodology with a simulation study and by modeling the movement of guppies. Supplementary materials accompanying this paper appear online.
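For readers unfamiliar with SPP models, the classical (non-hierarchical) Vicsek update is short enough to sketch: each agent adopts the mean heading of its neighbours plus noise and moves at constant speed. This is only the basic simulation step, not the hierarchical Bayesian version or the ABC estimation described in the abstract; all parameter values are hypothetical.

```python
import numpy as np

def vicsek_step(pos, theta, v0=0.03, r=1.0, eta=0.2, L=10.0, rng=None):
    """One update of a minimal Vicsek-type self-propelled particle model:
    each agent adopts the mean heading of neighbours within radius r,
    perturbed by uniform noise of width eta, then moves at speed v0
    in a periodic box of side L."""
    rng = np.random.default_rng() if rng is None else rng
    d = pos[:, None, :] - pos[None, :, :]
    d -= L * np.round(d / L)                          # periodic distances
    neigh = (d ** 2).sum(-1) <= r ** 2                # each agent counts itself as a neighbour
    sin_m = (neigh * np.sin(theta)[None, :]).sum(1)
    cos_m = (neigh * np.cos(theta)[None, :]).sum(1)
    theta_new = np.arctan2(sin_m, cos_m) + eta * (rng.random(len(theta)) - 0.5)
    pos_new = (pos + v0 * np.c_[np.cos(theta_new), np.sin(theta_new)]) % L
    return pos_new, theta_new

# toy usage: 50 agents, a few steps
rng = np.random.default_rng(0)
pos = rng.random((50, 2)) * 10.0
theta = rng.random(50) * 2 * np.pi
for _ in range(5):
    pos, theta = vicsek_step(pos, theta, rng=rng)
```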

15.
Mark-resight designs for estimation of population abundance are common and attractive to researchers. However, inference from such designs is very limited when faced with sparse data, either from a low number of marked animals, a low probability of detection, or both. In the Greater Yellowstone Ecosystem, yearly mark-resight data are collected for female grizzly bears with cubs-of-the-year (FCOY), and inference suffers from both limitations. To overcome difficulties due to sparseness, we assume homogeneity in sighting probabilities over 16 years of bi-annual aerial surveys. We model counts of marked and unmarked animals as multinomial random variables, using the capture frequencies of marked animals for inference about the latent multinomial frequencies for unmarked animals. We discuss undesirable behavior of the commonly used discrete uniform prior distribution on the population size parameter and provide OpenBUGS code for fitting such models. The application provides valuable insights into subtleties of implementing Bayesian inference for latent multinomial models. We tie the discussion to our application, though the insights are broadly useful for applications of the latent multinomial model.

16.
An aquatic hazard assessment of contaminated groundwater in a surficial aquifer was conducted at Beach Point, which is located in the Edgewood Area of the U.S. Army Garrison, Aberdeen Proving Ground, Maryland. Toxicity was detected at various groundwater concentrations by seven of 10 toxicity test systems exposed to a mixture of heavy metals and chlorinated aliphatic hydrocarbons. When estimated maximum acceptable toxicant concentrations (MATC) were established, the data for algae, invertebrates, and fish suggested that the groundwater would not be harmful at a concentration of 10% groundwater by volume. Likewise, no genotoxicity (Ames and SEC assays), developmental toxicity (FETAX), or chronic histopathology (9-month fish test) occurred at 10% groundwater by volume. Near-field (ULINE model) and far-field (dye-tracer model) screening-level dilution models were run to estimate the dilution of the groundwater discharge plume from Beach Point into the Bush River. The groundwater was considered a potentially excessive hazard to the biota in the Bush River when a number of conservative assumptions regarding contaminant distribution and discharge rate of the aquifer were used in the hazard assessment. By modeling the groundwater emanating from Beach Point as the dilution of a discharge from a line diffuser, the potential water quality impacts were judged to be minimal if State of Maryland surface water discharge criteria for a mixing zone were used for the discharge of groundwater to the Bush River.

17.
Ranked set sampling is a sampling approach that could lead to improved statistical inference when the actual measurement of the variable of interest is difficult or expensive to obtain but sampling units can be easily ordered by some means without actual quantification. In this paper, we consider the problem of bootstrapping an unbalanced ranked set sample (URSS) where the number of observations from each artificially created stratum can be unequal. We discuss resampling a URSS through transforming it into a balanced RSS and extending the existing algorithms. We propose two methods that are designed to obtain resamples from the given URSS. Algorithms are provided and several properties, including asymptotic normality of estimates, are discussed. The proposed methods are compared with the parametric bootstrap using Monte Carlo simulations for the problem of testing a hypothesis about the population mean.
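A naive way to picture bootstrapping a URSS, before any rebalancing, is to resample within each judgment-rank stratum at its own (unequal) size. The sketch below does exactly that with hypothetical data; it is not one of the paper's proposed methods, which first transform the URSS into a balanced RSS.

```python
import numpy as np

def bootstrap_urss_mean(urss, n_boot=2000, rng=None):
    """Naive within-rank bootstrap for an unbalanced ranked set sample.
    `urss` maps judgment rank -> 1-D array of measured values; each rank stratum
    is resampled with replacement at its own (unequal) size, and the unweighted
    mean of the rank-stratum means is recorded for each bootstrap replicate."""
    rng = np.random.default_rng() if rng is None else rng
    stats = np.empty(n_boot)
    for b in range(n_boot):
        means = [rng.choice(vals, size=len(vals), replace=True).mean()
                 for vals in urss.values()]
        stats[b] = np.mean(means)
    return stats

# toy unbalanced RSS with three judgment ranks (hypothetical data)
rng = np.random.default_rng(1)
urss = {1: rng.normal(9, 1, 4), 2: rng.normal(10, 1, 7), 3: rng.normal(11, 1, 3)}
boot = bootstrap_urss_mean(urss, rng=rng)
print(np.percentile(boot, [2.5, 97.5]))         # percentile interval for the mean
```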

18.
The uncertainty in estimation of spatial animal density from line transect surveys depends on the degree of spatial clustering in the animal population. To quantify the clustering we model line transect data as independent thinnings of spatial shot-noise Cox processes. Likelihood-based inference is implemented using Markov chain Monte Carlo (MCMC) methods to obtain efficient estimates of spatial clustering parameters. Uncertainty is addressed using parametric bootstrap or by consideration of posterior distributions in a Bayesian setting. Maximum likelihood estimation and Bayesian inference are compared in an example concerning minke whales in the northeast Atlantic.
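A shot-noise Cox process can be pictured as a cluster process (a Thomas-type Gaussian kernel is used below purely for illustration), and imperfect detection along transects as independent thinning. The sketch simulates such a thinned pattern with hypothetical parameters; it does not attempt the MCMC-based inference described in the abstract.

```python
import numpy as np

def simulate_thinned_sncp(lam_parent=20, mu_offspring=8, sigma=0.05,
                          p_detect=0.4, L=1.0, rng=None):
    """Simulate a Thomas-type shot-noise Cox process on an L x L square and
    thin it independently with detection probability p_detect, mimicking
    animals missed by a line transect survey."""
    rng = np.random.default_rng() if rng is None else rng
    n_par = rng.poisson(lam_parent * L * L)
    parents = rng.random((n_par, 2)) * L
    pts = []
    for px, py in parents:
        n_off = rng.poisson(mu_offspring)
        pts.append(np.column_stack([px + rng.normal(0, sigma, n_off),
                                    py + rng.normal(0, sigma, n_off)]))
    pts = np.concatenate(pts) if pts else np.empty((0, 2))
    pts = pts[(pts >= 0).all(1) & (pts <= L).all(1)]       # keep points inside the window
    detected = pts[rng.random(len(pts)) < p_detect]        # independent thinning
    return pts, detected

pts, det = simulate_thinned_sncp(rng=np.random.default_rng(3))
print(len(pts), len(det))
```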

19.
Inorganic toxicants in soils of North Rhine-Westphalia. I. Evaluation of multi-modal frequency distributions. In 335 soil samples from North Rhine-Westphalia the contents of 18 elements were quantified (total contents, the mobilizable fraction (extraction with an EDTA cocktail), and the mobile fraction (extraction with NH4NO3)). An inspection of the resulting histograms revealed that the distribution of element contents in soils is not normal and not even log-normal. The appearance of several peaks indicates the existence of multi-modal distributions with various independent sub-distributions. Two methods for analysing such data sets are presented. The parametric approach assumes several single lognormal components, which are fitted by the Expectation Maximization algorithm. In the nonparametric approach, Kernel Density Estimation is used. With this procedure, five to ten concentration levels can be determined per element for our data set, with the corresponding groups of soil samples being characterized by common features (e.g., parent material, region of origin, specific source of pollution). In this way, e.g., background concentrations for potentially toxic elements can be derived. The parametric approach lacks sharpness of separation in the case of more than four sub-distributions.
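One way to picture the two approaches is to fit a Gaussian mixture by EM on the log scale (equivalent to mixing lognormal components) and to compare it with a kernel density estimate whose peaks mark concentration levels. The sketch below uses simulated, hypothetical contents and assumes scikit-learn and SciPy are available; it is an illustration of the general idea, not the authors' procedure.

```python
import numpy as np
from sklearn.mixture import GaussianMixture
from scipy.stats import gaussian_kde

# Hypothetical element contents (mg/kg): a background population plus a contaminated subgroup.
rng = np.random.default_rng(2)
contents = np.concatenate([rng.lognormal(1.0, 0.3, 200),
                           rng.lognormal(2.5, 0.4, 60)])
x = np.log(contents).reshape(-1, 1)

gmm = GaussianMixture(n_components=2, random_state=0).fit(x)   # parametric (EM) approach
print(np.exp(gmm.means_.ravel()))        # component medians on the original scale

kde = gaussian_kde(x.ravel())            # nonparametric kernel density estimate
grid = np.linspace(x.min(), x.max(), 200)
density = kde(grid)                      # peaks of `density` indicate concentration levels
```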

20.
The radioactive liquid waste (RLW) system in Ontario Hydro's pressurised heavy water reactors collects drainage from a variety of sources ranging from floor drains to laundry waste. RLW effluent was intermittently toxic to rainbow trout and Daphnia magna during the first phase of Ontario's Municipal Industrial Strategy for Abatement (MISA) Program, apparently as a result of the interaction of a variety of known and unknown organic and inorganic compounds. Accordingly, we employed a treatment-based approach to reducing its toxicity, supplemented by chemical analysis. Two series of toxicity reduction tests were conducted. The first series explored the potential for sorption of the possible toxicants, while the second series incorporated a wider variety of treatments. Of the 24 samples in the first test series, 17 were toxic (D. magna mortality ≥ 50%). Of the toxic samples, only 7 of 17 were still toxic after passage through an activated carbon column, but 5 of 6 samples tested remained toxic after passage through a metal chelating resin column. In the second series, at least one of the treatments was effective in reducing the toxicity of all samples which were initially toxic (16 of 24 samples), but no one treatment was effective for all toxic samples. Three treatments (UV/H2O2 photo-oxidation with prior pH adjustment, or passage through a column of either a non-functionalized (N-F) resin or a mixture of N-F resin and a weak base (W-B) anion exchange resin) were effective in reducing the toxicity of more than 50% of the toxic samples; yet roughly 25% of these samples remained toxic after treatment. O2 sparging, UV/H2O2 photo-oxidation without prior pH adjustment, and passage through a column of the W-B resin were less effective, as more than 50% of the samples remained toxic after treatment. Filtering was not effective, as all of the treated samples (9/9) retained their toxicity. There was no obvious correspondence between toxicity and the concentrations of metals (Cu, Zn, Fe, Al and Cd), nor were any simple relationships apparent between toxicity and Total Organic Carbon or NH3 concentrations. At stations where radioactive liquid wastes are segregated, toxicity was also segregated, suggesting that we may be able to address the problem at source through a combination of Best Management Practices and smaller scale treatment facilities.

