Similar Documents
20 similar documents found (search time: 31 ms)
1.
Capture–recapture (CR) models assume marked individuals remain at risk of capture, which may not be true if individuals lose their mark or emigrate definitively from the study area. Using a double-marking protocol, with a main and auxiliary mark, and both live encounters and dead recoveries at a large scale, partially frees CR models from this assumption. However, the auxiliary mark may fall off and its presence is often not mentioned when dead individuals are reported. We propose a new model to deal with heterogeneity of detection and uncertainty of the presence of an auxiliary mark in a multi-event framework. Our general model, based on a double-marking protocol, uses information from physical captures/recaptures, distant observations and main mark recoveries from dead animals. We applied our model to a 13-year data set of a harvested species, the Greater Snow Goose. We obtained seasonal survival estimates for adults of both sexes. Survival estimates differed between models where the presence of the auxiliary mark upon recovery was ignored versus those where the presence was accounted for. In the multi-event framework, seasonal survival estimates are no longer biased because the heterogeneity due to the presence of an auxiliary mark is accounted for in the estimation of recovery rates.

2.
The models presented in this paper are motivated by a stop-over study of semipalmated sandpipers, Calidris pusilla. Two sets of data were collected at the stop-over site: a capture–recapture–resighting data set and a vector of counts of unmarked birds. The two data sets are analyzed simultaneously by combining a new model for the capture–recapture–resighting data set with a binomial likelihood for the counts. The aim of the analysis is to estimate the total number of birds that used the site and the average duration of stop-over. The combined analysis is shown to be highly efficient, even when just 1% of birds are recaptured, and is recommended for similar investigations. This article has supplementary material online.

3.
The Cormack–Jolly–Seber (CJS) model assumes that all marked animals have equal recapture probabilities at each sampling occasion, but heterogeneity in capture often occurs and should be taken into account to avoid biases in parameter estimates. Although diagnostic tests are generally used to detect trap-dependence or transience and assess the overall fit of the model, heterogeneity in capture is not routinely tested for. In order to detect and identify this phenomenon in a CJS framework, we propose a test of positive association between previous and future encounters using Goodman–Kruskal’s gamma. This test is based solely on the raw capture histories and makes no assumption on model structure. The development of the test is motivated by a dataset of Sandwich terns (Thalasseus sandvicensis), and we use the test to formally show that they exhibit heterogeneity in capture. We use simulation to assess the performance of the test in the detection of heterogeneity in capture, compared to existing and corrected diagnostic goodness-of-fit tests, Leslie’s test of equal catchability and Carothers’ extension of the Leslie test. The test of positive association is easy to use and produces good results, demonstrating high power to detect heterogeneity in capture. We recommend using this new test prior to model fitting as the outcome will guide the model-building process and help draw more accurate biological conclusions. Supplementary materials accompanying this paper appear online.
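The test statistic itself is straightforward to compute from raw capture histories. As a minimal sketch (the capture histories and the mid-study split point below are invented for illustration, not taken from the Sandwich tern dataset), Goodman–Kruskal's gamma over per-individual counts of previous and future encounters might look like:

```python
from itertools import combinations

def goodman_kruskal_gamma(x, y):
    """Goodman-Kruskal's gamma for two paired ordinal variables:
    (C - D) / (C + D), where C and D count concordant and discordant
    pairs; tied pairs are ignored, as in the standard definition."""
    concordant = discordant = 0
    for (x1, y1), (x2, y2) in combinations(zip(x, y), 2):
        s = (x1 - x2) * (y1 - y2)
        if s > 0:
            concordant += 1
        elif s < 0:
            discordant += 1
    if concordant + discordant == 0:
        return 0.0
    return (concordant - discordant) / (concordant + discordant)

# Hypothetical capture histories (1 = encountered on that occasion):
histories = [
    [1, 1, 1, 0, 1, 1],  # frequently seen before and after the split
    [1, 0, 0, 0, 0, 0],  # seen once, never again
    [1, 1, 0, 0, 1, 0],
    [1, 0, 1, 0, 0, 0],
    [1, 1, 1, 1, 1, 0],
]
mid = 3  # split occasions into "previous" and "future"
prev = [sum(h[:mid]) for h in histories]
future = [sum(h[mid:]) for h in histories]
print(goodman_kruskal_gamma(prev, future))
```

A positive gamma indicates that individuals encountered often in the past also tend to be encountered often in the future, the signature of heterogeneity in capture.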

4.
Biologists often use more than one marking technique in wildlife studies. For each of the mark types, it is common to conduct a separate analysis of the recapture data to estimate parameters of interest, such as survival rates. Two data types that can be used in estimating survival rates are resighting and radiotelemetry data. The Cormack-Jolly-Seber model is commonly used to analyze the resighting data, while the Kaplan-Meier product limit estimator, modified for staggered entry of animals, is used to analyze the radiotelemetry data. In a study where some animals receive two types of tags and others receive just one tag type, the separate Cormack-Jolly-Seber and Kaplan-Meier analyses do not exploit all of the information in the combined data sets. In this article, we propose a model and likelihood for the combined analysis of resighting and radiotelemetry data. In comparison with the separate analyses, this richer model provides more information about the biology and sampling processes. For example, the richer model permits assessment of assumptions required by the separate analyses and allows estimation of additional parameters. We apply the model to annual resighting and monthly telemetry data from a population of snail kites in Florida. The snail kite is a threatened species of bird in the United States, and our results on survival are very important. In this example, all birds are marked using leg bands and some of them receive radios.

5.
The Census of Agriculture is conducted every 5 years, in years ending in 2 and 7. The Census list frame is incomplete, resulting in undercoverage. Not all operations on the list frame respond, and, based on the response, some misclassification occurs. In 2012, a capture–recapture analysis was conducted to adjust for undercoverage, nonresponse, and misclassification. This was the first time capture–recapture methods were used to produce official statistics for an establishment survey. The number of records on the Census Mailing List that were classified as farms was 1,382,099, and the published estimate of the number of farms was 2,109,303, a 34.5% adjustment. The adjustment was greatest for farms with low production levels and for specialty farms, both of which are difficult to identify and add to the list. The methods used are described. Challenges that arose in the implementation process are discussed. Areas for enhancement being targeted for the 2017 Census of Agriculture are highlighted. Supplementary materials accompanying this paper appear online.
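The underlying idea is classical dual-system (two-list) estimation. A minimal sketch, using the published totals plus an invented toy example (the actual Census methodology is considerably more elaborate, also handling nonresponse and misclassification):

```python
def lincoln_petersen(n1, n2, m):
    """Classical dual-system (two-list) abundance estimator:
    N_hat = n1 * n2 / m, where n1 and n2 are the two list sizes
    and m is the number of units appearing on both lists."""
    return n1 * n2 / m

# Published figures: list-frame farms vs. the final published estimate.
list_farms = 1_382_099
published_estimate = 2_109_303

# The quoted 34.5% adjustment is the share of the final estimate
# contributed by the capture-recapture correction.
adjustment = (published_estimate - list_farms) / published_estimate
print(f"{adjustment:.1%}")  # 34.5%

# Toy two-list example (invented numbers): list 1 has 1000 units,
# list 2 has 800, and 400 appear on both.
print(lincoln_petersen(1000, 800, 400))  # 2000.0
```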

6.
After oil spills, oiled wildlife are regularly cleaned at considerable cost. Yet the conservation value of this intervention has been questioned, mostly because cleaned animals appear to have poor post-release survival. However, reliable long-term data are needed to judge the success of such programs. We used 16 ring recoveries to estimate survival of 932 Cape gannets (Morus capensis) that were oiled, cleaned and released in 1983. For the period 1989-2006, we further compared survival of 162 surviving cleaned gannets to that of 10,558 non-oiled gannets using capture-recapture data. We used modern statistical tools that account for recapture probabilities, recovery rates, transience, and temporary absence from the breeding colonies. Mean annual survival rates of de-oiled gannets ranged from 0.84 (se = 0.05) to 0.88 (se = 0.02), depending on analysis and colony. Between 1989 and 2006, rehabilitated gannets survived slightly less well than unoiled birds, but the difference was similar to the difference in survival between the two colonies where the birds were studied. Our results show subtle long-term effects of oiling and subsequent treatment. However, they also show that cleaned gannets can survive at almost the same rate as unoiled birds, at least if they survived the initial years after release. Rehabilitation of these birds may thus be a valuable conservation intervention for this localised species where a single spill can threaten a large proportion of the world population.

7.
Owing to habitat conversion and conflict with humans, many carnivores are of conservation concern. Because of their elusive nature, camera trapping is a standard tool for studying carnivores. In many vertebrates, sex-specific differences in movements – and therefore detection by cameras – are likely. We used camera trapping data and spatially explicit sex-specific capture–recapture models to estimate jaguar density in Emas National Park in the central Brazilian Cerrado grassland, an ecological hotspot of international importance. Our spatially explicit model considered differences in movements and trap encounter rate between the sexes and the location of camera traps (on/off road). We compared results with estimates from a sex-specific non-spatial capture–recapture model. The spatial model estimated a density of 0.29 jaguars per 100 km² and showed that males moved larger distances and had higher trap encounter rates than females. Encounter rates with off-road traps were one tenth of those for on-road traps. In the non-spatial model, males had a higher capture probability than females; density was estimated at 0.62 individuals per 100 km². The non-spatial model likely overestimated density because it did not adequately account for animal movements. The spatial model probably underestimated density because it assumed a uniform distribution of jaguars within and outside the reserve. Overall, the spatial model is preferable because it explicitly considers animal movements and allows incorporating site-specific and individual covariates. With both methods, jaguar density was lower than reported from most other study sites. For rare species such as grassland jaguars, spatially explicit capture–recapture models present an important advance for informed conservation planning.
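In spatially explicit capture–recapture models of this kind, the expected encounter rate between an animal and a trap typically declines with distance from the animal's activity centre, commonly via a half-normal function. A hypothetical sketch (the parameter values are invented to mimic the qualitative pattern reported, not the study's actual estimates):

```python
import math

def halfnormal_encounter_rate(d, lam0, sigma):
    """Half-normal encounter-rate function used in spatially explicit
    capture-recapture: lam0 is the rate at distance zero, sigma governs
    how quickly encounters decay with trap-to-centre distance d."""
    return lam0 * math.exp(-d * d / (2 * sigma * sigma))

# Hypothetical parameters: males range more widely (larger sigma) and
# are encountered more often at distance zero (larger lam0).
for sex, lam0, sigma in [("female", 0.5, 2.0), ("male", 1.0, 4.0)]:
    rate = halfnormal_encounter_rate(3.0, lam0, sigma)
    print(sex, round(rate, 3))
```

Fitting such a model estimates density jointly with movement, which is why it avoids the overestimation that the non-spatial model suffered.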

8.
Traditional analyses of capture–recapture data are based on likelihood functions that explicitly integrate out all missing data. We use a complete data likelihood (CDL) to show how a wide range of capture–recapture models can be easily fitted using readily available software JAGS/BUGS even when there are individual-specific time-varying covariates. The models we describe extend those that condition on first capture to include abundance parameters, or parameters related to abundance, such as population size, birth rates or lifetime. The use of a CDL means that any missing data, including uncertain individual covariates, can be included in models without the need for customized likelihood functions. This approach also facilitates modeling processes of demographic interest rather than the complexities caused by non-ignorable missing data. We illustrate using two examples, (i) open population modeling in the presence of a censored time-varying individual covariate in a full robust design, and (ii) full open population multi-state modeling in the presence of a partially observed categorical variable. Supplemental materials for this article are available online.
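The appeal of the CDL is that, once latent alive states are treated as data, the likelihood factorizes into simple Bernoulli terms that a Gibbs sampler such as JAGS/BUGS can evaluate directly. A minimal sketch for a constant-parameter CJS-style model, conditioning on first capture (the data are invented, and in practice the latent states are sampled rather than known):

```python
import math

def cjs_complete_data_loglik(y, z, phi, p, first):
    """Complete-data log-likelihood for a CJS-style model.
    y[i][t]: detection (0/1); z[i][t]: latent alive state (0/1),
    treated here as known; phi: survival probability; p: detection
    probability; first[i]: occasion of first capture for individual i."""
    ll = 0.0
    for yi, zi, f in zip(y, z, first):
        for t in range(f + 1, len(yi)):
            if zi[t - 1] == 0:
                break  # dead stays dead; no further terms
            # survival transition: z[t] | z[t-1]=1 ~ Bernoulli(phi)
            ll += math.log(phi) if zi[t] else math.log(1 - phi)
            if zi[t]:
                # detection: y[t] | z[t]=1 ~ Bernoulli(p)
                ll += math.log(p) if yi[t] else math.log(1 - p)
    return ll

# Two hypothetical individuals, 4 occasions, both first caught at t=0.
y = [[1, 1, 0, 1], [1, 0, 0, 0]]
z = [[1, 1, 1, 1], [1, 1, 0, 0]]
print(cjs_complete_data_loglik(y, z, phi=0.8, p=0.6, first=[0, 0]))
```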

9.
Samples of vegetation and faeces from heavily grazed pastures are likely to be contaminated with soil to some extent. The determination of total potassium on these samples will result in low recoveries if a wet oxidation procedure normally used for vegetation analysis is employed. However, data are presented which show that for the sites studied satisfactory recoveries of total potassium are obtained following a sulphuric acid‐hydrogen peroxide digestion, providing the ash content of the sample does not exceed 25%. For ash contents greater than 25%, alkaline fusion or hydrofluoric/perchloric acid decomposition procedures are required to obtain full recovery.

10.
Wheat flours were stored at room temperature (15–25°C), 40, 60, 80, and 100°C for various times. The baking performance of these flours was then evaluated in terms of the springiness of pancakes (recovery from crushing). Baking performance improved with increased storage time at each temperature. Brabender Amylograph tests of the flours indicated that the viscosity onset temperature decreased with increased storage time at each temperature. When the flours were fractionated by acetic acid (pH 3.5) with mortar and pestle, recoveries of the water-solubles and gluten fractions were unchanged, but recoveries of prime starch and tailings fractions changed remarkably with increased storage time. On the other hand, those changes were not observed when flours were fractionated with a Waring blender. The binding of prime starch to tailings was correlated significantly with baking performance.

11.
A continuation of an earlier interlaboratory comparison was conducted (1) to assess solid-phase extraction (SPE) using Empore disks to extract atrazine, bromacil, metolachlor, and chlorpyrifos from various water sources accompanied by different sample shipping and quantitative techniques and (2) to compare quantitative results of individual laboratories with results of one common laboratory. Three replicates of a composite surface water (SW) sample were fortified with the analytes along with three replicates of deionized water (DW). A nonfortified DW sample and a nonfortified SW sample were also extracted. All samples were extracted using Empore C18 disks. After extraction, some of the samples were eluted and analyzed in-house. Duplicate samples were evaporated in a 2-mL vial, shipped dry to a central laboratory (SDC), redissolved, and analyzed. Overall, samples analyzed in-house had higher recoveries than SDC samples. Laboratory × analysis type and laboratory × water source interactions were significant for all four compounds. Seven laboratories participated in this interlaboratory comparison program. No differences in atrazine recoveries were observed from in-house samples analyzed by laboratories A, B, D, and G compared with the recovery of SDC samples. In-house atrazine recoveries from laboratories C and F were higher when compared with recovery from SDC samples. However, laboratory E had lower recoveries from in-house samples compared with SDC samples. For each laboratory, lower recoveries were observed for chlorpyrifos from the SDC samples compared with samples analyzed in-house. Bromacil recovery was <65% at two of the seven laboratories in the study. Bromacil recoveries for the remaining laboratories were >75%. Three laboratories showed no differences in metolachlor recovery; two laboratories had higher recoveries for samples analyzed in-house, and two other laboratories showed higher metolachlor recovery for SDC samples.
Laboratory G had a higher recovery in SW for all four compounds compared with DW. Other laboratories that had significant differences in pesticide recovery between the two water sources showed higher recovery in DW than in the SW regardless of the compound. In comparison to earlier work, recovery of these compounds using SPE disks as a temporary storage matrix may be more effective than shipping dried samples in a vial. Problems with analytes such as chlorpyrifos are unavoidable, and it should not be assumed that an extraction procedure using SPE disks will be adequate for all compounds and transferable across all chromatographic conditions.

12.
A prototype multiresidue method based on fast extraction and dilution of samples followed by flow injection mass spectrometric analysis is proposed here for high-throughput chemical screening in complex matrices. The method was tested for sulfonylurea herbicides (triflusulfuron methyl, azimsulfuron, chlorimuron ethyl, sulfometuron methyl, chlorsulfuron, and flupyrsulfuron methyl), carbamate insecticides (oxamyl and methomyl), pyrimidine carboxylic acid herbicides (aminocyclopyrachlor and aminocyclopyrachlor methyl), and anthranilic diamide insecticides (chlorantraniliprole and cyantraniliprole). Lemon and pecan were used as representative high-water and low-water content matrices, respectively, and a sample extraction procedure was designed for each commodity type. Matrix-matched external standards were used for calibration, yielding linear responses with correlation coefficients (r) consistently >0.99. The limits of detection (LOD) were estimated to be between 0.01 and 0.03 mg/kg for all analytes, allowing execution of recovery tests with samples fortified at ≥0.05 mg/kg. Average analyte recoveries obtained during method validation for lemon and pecan ranged from 75 to 118% with standard deviations between 3 and 21%. Representative food processed fractions were also tested, that is, soybean oil and corn meal, yielding individual analyte average recoveries ranging from 62 to 114% with standard deviations between 4 and 18%. An intralaboratory blind test was also performed; the method excelled with 0 false positives and 0 false negatives in 240 residue measurements (20 samples × 12 analytes). 
The daily throughput of the fast extraction and dilution (FED) procedure is estimated at 72 samples/chemist, whereas the flow injection mass spectrometry (FI-MS) throughput could be as high as 4.3 sample injections/min, making very efficient use of mass spectrometers with negligible instrumental analysis time compared to the sample homogenization, preparation, and data processing steps.
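The recovery figures quoted above rest on routine calibration arithmetic: fit a matrix-matched external standard curve, invert it for the fortified sample's response, and express the result as a percentage of the spike level. A sketch with invented numbers (the study's actual calibration data are not reproduced here):

```python
def fit_line(x, y):
    """Ordinary least-squares fit of y = a*x + b; returns (a, b)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    a = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / \
        sum((xi - mx) ** 2 for xi in x)
    return a, my - a * mx

# Hypothetical matrix-matched standards: spike level (mg/kg) vs. response.
levels = [0.0, 0.05, 0.1, 0.5, 1.0]
responses = [20.0, 640.0, 1260.0, 6220.0, 12420.0]
slope, intercept = fit_line(levels, responses)

# A sample fortified at 0.05 mg/kg that gave a response of 600 units:
measured = (600.0 - intercept) / slope
recovery = 100 * measured / 0.05
print(f"recovery = {recovery:.0f}%")
```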

13.
We found evidence that tagging induced trap shyness in snapper (Pagrus auratus), i.e., tagged fish had a reduced probability of recapture by the method by which they had originally been caught. Tagging experiments in 1985 and 1994 involving single release over a short period and single recapture were conducted on a closed population (SNA 1: East Northland-Hauraki Gulf-Bay of Plenty). Initial capture was by trawling and by line fishing, while recapture over an extended period included other methods. A test for trap shyness that removed the possible effects of spatial and fish size heterogeneity gave a significant result for both years. The data suggested that the trap shyness effect might have been smaller for trawl-released fish than for line-released fish. However, we estimated a single trap shyness factor (0.71). There was also some evidence for attenuation of trap shyness over time.

14.
A simple, rapid enzyme-linked immunoassay (ELISA) was used to evaluate the performance of each step (extraction, filtration, solvent partition, and silica gel column chromatography) of a solvent-efficient thin-layer chromatographic (TLC) method which is undergoing interlaboratory collaborative study for the determination of aflatoxin B1 in corn, raw peanuts, and peanut butter. The apparent average recoveries using the ELISA method were about 30 to 50% higher than those using the TLC method if only the amount of B1 added to the samples was used in the calculations. After the cross-reaction of the antibody with other aflatoxins added to the samples was considered, the amounts recovered approached the levels of aflatoxins added in all 3 commodities tested. With no cleanup treatment, ELISA recoveries at aflatoxin B1 levels above 7.5 ng/g were 84, 79, and 103% for corn, raw peanuts, and peanut butter, respectively. The coefficients of variation were between 5.2 and 25.2%. With each cleanup step in the TLC method, ELISA detected a progressive decrease in recovery from 150.5 to 105.3% (before correction for the presence of other aflatoxins) or from 93.5 to 65.4% (after correction for other aflatoxins) of B1 added to the samples. The ELISA data support the conclusion obtained from previous studies that cleanup treatments were not necessary in the ELISA. When large amounts of other aflatoxins are present, an understanding of the cross-reactivity of antibody with other aflatoxins in the ELISA is essential for final interpretation of the data.

15.
We develop a novel modeling strategy for analyzing data with repeated binary responses over time as well as time-dependent missing covariates. We assume that covariates are missing at random (MAR). We use the generalized linear mixed logistic regression model for the repeated binary responses and then propose a joint model for time-dependent missing covariates using information from different sources. A Monte Carlo EM algorithm is developed for computing the maximum likelihood estimates. We propose an extended version of the AIC criterion to identify the important factors that may explain the binary responses. A real plant dataset is used to motivate and illustrate the proposed methodology.

16.
A method for the extraction of bentazone, dichlorprop, and MCPA in three selected Norwegian soils of different textures is described. Initially three different extraction methods were tested on one soil type. All methods gave recoveries >80% for the pesticide mixture, but extraction with sodium hydroxide in combination with solid-phase preconcentration was used for further recovery tests with soils of different properties spiked at four herbicide concentration levels (0.001-10 µg/g of wet soil). The method was rapid and easy and required a minimum of organic solvents. The recoveries were in the range of 82-109, 80-123, and 45-91% for the soils containing 1.4 (Hole), 2.5 (Kroer), and 37.8% (Froland) organic carbon, respectively. Limits of quantification using GC-MS were 0.0003 µg/g of wet soil for bentazone and 0.0001 µg/g of wet soil for both dichlorprop and MCPA.

17.
Classical metapopulation theory (CMT) has proven an attractive paradigm for ecologists concerned with the conservation of aquatic-breeding amphibians, given its apparent fit with the population dynamics of these animals, and the opportunities the concept provides to assess alternate management options. Nevertheless, several authors have cautioned against uncritical application of this paradigm. We assessed the application of CMT to the conservation of the endangered growling grass frog (Litoria raniformis) in the urbanising landscapes of Melbourne, Victoria, Australia. Support for five predictions developed from the basic tenets of CMT was assessed using a multi-year occupancy and mark–recapture data-set. There was congruence between all five predictions and data. Wetland occupancy was strongly influenced by the proximity of neighbouring populations (‘connectivity’), but the estimated rate of dispersal between wetlands was low. Wetland occupancy was also temporally dynamic, with only a weak effect of connectivity on the probability of extinction, but a strong positive influence of connectivity on the probability of colonisation. Our work confirms that CMT provides a useful model of the dynamics of L. raniformis in urbanising landscapes, and justifies the application of the paradigm to conservation planning for this species. We argue that CMT may prove relevant to numerous aquatic-breeding amphibians, and encourage assessment of the application of CMT to the conservation of these animals.

18.
A rapid, ion-exchange liquid chromatographic method for the determination of nitrate and nitrite in biological fluids is presented. Samples are deproteinated by ultrafiltration followed by removal of chloride using a silver form cation-exchange resin. Nitrate and nitrite are measured by ion-exchange liquid chromatography with conductivity detection. Recoveries from serum, ocular fluid, and water were determined for fortifications from 10 to 150 mg/L. Average recoveries ranged from 96 to 104% for nitrate and from 89 to 105% for nitrite. Pooled RSD values ranged between 1.5 and 1.9% for these analytes in all matrixes examined. The method of joint confidence hexagons was applied to the data to determine constant and relative bias of the method for each of the 3 matrixes in the study.

19.
The perch population of Lake Vähä Valkjärvi, a two-hectare clear-water lake in southern Finland, decreased due to acid precipitation during the 1980s. During the early 1990s a decrease in acidic deposition resulted in a slight improvement in the water quality of the lake. This was followed by recovery of perch reproduction starting in 1991. A mark–recapture experiment in spring 1995 indicated a hundredfold increase in the perch population size over a four-year period. A decrease in the abundance of aquatic invertebrates was recorded during 1989–1996. This decrease coincided well with the recovery of the perch population, suggesting that increased predation by fish was responsible for the decrease. The occurrence of goldeneye young in L. Vähä Valkjärvi has also dropped since 1993. This was thought to be due to increased food competition with perch.

20.
In proteomic studies, a population of proteins is often examined on a gel using a technique called two-dimensional gel electrophoresis. The technique separates the protein population into individual protein spots on a two-dimensional gel by isoelectric charge and molecular weight. The resulting gel images are then processed by a software system for spot detection and subsequent analysis. The performance of a spot-detection program is evaluated by the total number of spots that are detected. A popular spot-detection program uses the “master–slave” approach, where all spots on “slave images” are subsets of the spots on the “master image.” We argue that this approach potentially misses a large proportion of proteins and propose a model that quantifies the lack of performance. We provide nonparametric estimators for the protein population size and the expected number of proteins to be detected if a “fusion-gel” approach was used. Using the data from a rat liver proteome study, we estimate that more than half of the protein population is missed by the master–slave approach.
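The paper's specific nonparametric estimators are not reproduced here, but the flavour of capture–recapture population-size estimation from detection frequencies can be illustrated with Chao's classical lower bound, which uses the counts of items detected exactly once and exactly twice:

```python
def chao1(f1, f2, observed):
    """Chao's nonparametric lower bound for population size:
    N_hat = S_obs + f1^2 / (2 * f2), where f1 and f2 are the numbers
    of items detected exactly once and exactly twice, and S_obs is
    the total number of distinct items observed."""
    return observed + f1 * f1 / (2 * f2)

# Hypothetical spot-detection frequencies across replicate gels:
# 120 spots seen on exactly one gel, 80 on exactly two,
# and 500 distinct spots detected in total.
print(chao1(f1=120, f2=80, observed=500))  # 590.0
```

Many singleton spots relative to doubletons push the estimate well above the observed count, which is the same logic by which undetected proteins are inferred.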
