Similar literature: 20 results
1.
A semi-quantitative model is presented to rank freshwater rainbow trout farms within a country or region with regard to the risk of becoming infected with and spreading a specified pathogen. The model was developed to support a risk-based surveillance scheme for notifiable salmonid pathogens. Routes of pathogen introduction and spread were identified through a process of expert consultation in a series of workshops. The routes were combined into themes (e.g. exposure via water, mechanical transmission). Themes were weighted based on expert opinion. Risk factors for each route were scored and combined into a theme score, which was adjusted by the weight. The number of sources and consignments was used to assess introduction via live fish movements onto the farm. Biosecurity measures were scored to assess introduction on fomites. Upstream farms, wild fish and processing plants were included in assessing the likelihood of introduction by water. The scores for each theme were combined to give separate risk scores for introduction and spread. A matrix was used to combine these into an overall risk score. A case study for viral haemorrhagic septicaemia is presented. Nine farms that represent a range of farming practices of rainbow trout farms in England and Wales are used as worked examples of the model. The model is suited to risk-ranking freshwater salmonid farms which are declared free of the pathogen(s) under consideration. The score allocated to a farm does not equate to a quantitative estimate of the probability that the farm will become infected or spread infection. Nevertheless, the method provides a transparent approach to ranking farms with regard to pathogen transmission risks. The output of the model at a regional or national level allows the allocation of surveillance effort to be risk-based. It also provides fish farms with information on how they can reduce their risk score by improving biosecurity. The framework of the model can be applied to different production systems which may have other routes of disease spread. Further work is recommended to validate the allocated scores. Expert opinion was obtained through workshops, where the outputs from groups were single point estimates for relative weights of risks. More formal expert opinion elicitation methods could be used to capture variation in the experts' estimates and uncertainty, and would provide data on which to simulate the model stochastically. The model can be downloaded (in Microsoft Excel format) from the Internet at: http://www.cefas.defra.gov.uk/6701.aspx.
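The scoring logic above can be illustrated with a short sketch. This is not the published spreadsheet: the theme names, weights, route scores and risk-matrix categories below are invented purely to show how route scores might roll up into weighted theme scores and how the introduction and spread scores might be combined through a matrix.

    # hypothetical theme weights (the study elicited its weights from experts)
    theme_weights = {"live fish movements": 0.4, "water": 0.35, "mechanical/fomites": 0.25}

    def theme_score(route_scores):                 # route scores on an assumed 0-3 scale
        return sum(route_scores) / (3 * len(route_scores))

    def weighted_total(theme_scores):
        return sum(theme_weights[t] * s for t, s in theme_scores.items())

    def risk_category(score, cuts=(0.33, 0.66)):   # assumed cut-offs
        return "low" if score < cuts[0] else "medium" if score < cuts[1] else "high"

    introduction = weighted_total({"live fish movements": theme_score([2, 3]),
                                   "water": theme_score([1, 1, 0]),
                                   "mechanical/fomites": theme_score([2])})
    spread = weighted_total({"live fish movements": theme_score([1]),
                             "water": theme_score([2, 2]),
                             "mechanical/fomites": theme_score([1])})

    # the matrix step: overall risk from the introduction and spread categories
    matrix = {("low", "low"): "low", ("low", "medium"): "medium", ("medium", "low"): "medium",
              ("medium", "medium"): "medium", ("low", "high"): "high", ("high", "low"): "high",
              ("medium", "high"): "high", ("high", "medium"): "high", ("high", "high"): "high"}
    print(matrix[(risk_category(introduction), risk_category(spread))])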

2.
Certification that a country, region or state is "free" from a pathogen or has a prevalence less than a threshold value has implications for trade in animals and animal products. We develop a Bayesian model for assessment of (i) the probability that a country is "free" of or has an animal pathogen, (ii) the proportion of infected herds in an infected country, and (iii) the within-herd prevalence in infected herds. The model uses test results from animals sampled in a two-stage cluster sample of herds within a country. Model parameters are estimated using modern Markov chain Monte Carlo methods. We demonstrate our approach using published data from surveys of Newcastle disease and porcine reproductive and respiratory syndrome in Switzerland, and for three simulated data sets.
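For intuition, the kind of quantity such a model targets can be approximated with a short Monte Carlo sketch: the posterior probability that a country is free of the pathogen given that a two-stage cluster survey returned no positive tests. The sample sizes, test characteristics and Beta priors below are assumptions for illustration; the paper itself fits the model with Markov chain Monte Carlo rather than this simple integration.

    import numpy as np

    rng = np.random.default_rng(0)
    n_herds, n_per_herd = 50, 20        # assumed two-stage sample: herds tested, animals per herd
    se, sp = 0.95, 0.999                # assumed animal-level test sensitivity / specificity
    prior_free = 0.5                    # assumed prior probability that the country is free

    def p_all_negative(country_infected, n_sims=20000):
        """Monte Carlo estimate of P(survey yields no positive tests | country status)."""
        if not country_infected:
            return sp ** (n_herds * n_per_herd)          # only false positives are possible
        p_herd = rng.beta(1, 9, n_sims)                  # assumed prior on proportion of infected herds
        p_within = rng.beta(2, 8, n_sims)                # assumed prior on within-herd prevalence
        p_pos_animal = p_within * se + (1 - p_within) * (1 - sp)
        p_neg_infected_herd = (1 - p_pos_animal) ** n_per_herd
        p_neg_free_herd = sp ** n_per_herd
        p_neg_herd = p_herd * p_neg_infected_herd + (1 - p_herd) * p_neg_free_herd
        return np.mean(p_neg_herd ** n_herds)

    num = prior_free * p_all_negative(False)
    post_free = num / (num + (1 - prior_free) * p_all_negative(True))
    print(f"P(country free | no positive tests) = {post_free:.3f}")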

3.
Samples from livestock or food items are often submitted to microbiological analysis to determine whether or not the group (herd, flock or consignment) is shedding or is contaminated with a bacterial pathogen. This process is known as 'herd testing' and has traditionally involved subjecting each sample to a test on an individual basis. Alternatively, one or more pools can be formed by combining and mixing samples from individuals (animals or items), and then each pool is subjected to a test for the pathogen. I constructed a model to simulate the herd-level sensitivity of the individual-sample approach (HSe) and the herd-level sensitivity of the pooled-sample approach (HPSe) of tests for detecting the pathogen. The two approaches are compared by calculating the relative sensitivity (RelHSe = HPSe/HSe). An assumption is that the microbiological procedures have 100% specificity. The new model accounts for the potential for HPSe and RelHSe to be reduced by the dilution of pathogen that occurs when contaminated samples are blended with pathogen-free samples. Key inputs include a probability distribution describing the concentration of the pathogen of interest in samples, characteristics of the pooled-test protocol, and a 'test-dose-response curve' that quantifies the relationship between the concentration of pathogen in the pool and the probability of detecting the target organism. The model also compares the per-herd cost of the pooled-sample and individual-sample approaches to herd testing. When applied to the example of Salmonella spp. in cattle feces, it was shown that a reduction in the assumed prevalence of shedding can cause a substantial fall in HPSe and RelHSe. However, these outputs are much less sensitive to changes in prevalence when the number of samples per pool is high, or when the number of pools per herd-test is high, or both. By manipulating the number of pools per herd and the number of samples per pool, HPSe can be optimized to suit the range of values of true prevalence of shedding of Salmonella that are likely to be encountered in the field.
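A simulation of this comparison can be sketched in a few lines. Everything below (prevalence, the log-normal concentration distribution, the logistic 'test-dose-response' curve, pool layout) is an assumption chosen only to illustrate how dilution can depress HPSe relative to HSe; the published model and its parameter values differ.

    import numpy as np

    rng = np.random.default_rng(1)

    n_samples, pool_size = 60, 10          # samples per herd test, samples per pool (assumed)
    prev = 0.05                            # assumed true prevalence of shedding
    log10_conc = (3.0, 1.0)                # assumed mean, sd of log10 CFU/g in positive samples

    def p_detect(conc):
        """Assumed 'test-dose-response' curve: detection probability vs concentration."""
        return 1.0 / (1.0 + np.exp(-2.0 * (np.log10(conc + 1e-9) - 2.0)))

    def herd_sensitivity(pooled, n_iter=20000):
        detected_herds = infected_herds = 0
        for _ in range(n_iter):
            shedding = rng.random(n_samples) < prev
            if not shedding.any():                      # sensitivity conditions on an infected herd
                continue
            infected_herds += 1
            conc = np.where(shedding, 10 ** rng.normal(*log10_conc, n_samples), 0.0)
            tested = conc.reshape(-1, pool_size).mean(axis=1) if pooled else conc
            if (rng.random(tested.size) < p_detect(tested)).any():
                detected_herds += 1
        return detected_herds / infected_herds

    hse, hpse = herd_sensitivity(False), herd_sensitivity(True)
    print(f"HSe = {hse:.2f}, HPSe = {hpse:.2f}, RelHSe = {hpse / hse:.2f}")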

4.
International trade of livestock and livestock products poses a significant potential threat for the spread of diseases, and importing countries therefore often require that imported animals and products are free from certain pathogens. However, absolute freedom from infection cannot be documented, since all test protocols are imperfect and can lead to false-negative results. It is possible instead to estimate the "probability of freedom from infection" and its opposite, the probability of infection despite having a negative test result. These probabilities can be estimated based on a pre-defined target prevalence, known surveillance efforts in the target population and known test characteristics of any pre-export test. Here, calculations are demonstrated using the example of bovine herpes virus-1 (BoHV-1). In a population that recently became free of BoHV-1 without using vaccination, the probability that an animal randomly selected for trade is infected is 800 per 1 million, and this probability is reduced to 64 (95% probability interval, PI, 6-161) per 1 million when this animal tests negative prior to export with a gB-ELISA. In a population that recently became free of BoHV-1 using vaccination, the probability that an animal randomly selected for trade is infected is 200 per 1 million, and this probability can be reduced to 63 (95% PI 42-87) per 1 million when this animal tests negative prior to export with a gE-ELISA. Similar estimations can be made on a herd level when assumptions are made about the herd size and the intensity of the surveillance efforts. Subsequently, the overall probability for an importing country of importing at least 1 infected animal can be assessed by taking into account the trade volume. Definition of the acceptable level of risk, including the probability that false-negative results occur, is part of risk management. Internationally harmonized target prevalence levels for the declaration of freedom from infection from selected pathogens provide a significant contribution to the facilitation of international trade of livestock and livestock products by allowing exporting countries to design tailor-made output-based surveillance programs, while providing equivalent guarantees regarding the probability of freedom from infection of the population. Combining this with an approach to assess the overall probability of introducing at least 1 infected animal into an importing country during a defined time interval will help importing countries to achieve their desired level of acceptable risk and will help to assess the equivalence of animal health and food safety standards between trading partners.
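The animal-level figures follow from Bayes' rule applied to a negative pre-export test. The snippet below reproduces the order of magnitude of the first example; the sensitivity and specificity used here are assumptions for illustration, not necessarily the gB-ELISA characteristics used in the paper.

    def p_infected_given_negative(p_inf, se, sp):
        """Posterior probability of infection after one negative test (Bayes' rule)."""
        return p_inf * (1 - se) / (p_inf * (1 - se) + (1 - p_inf) * sp)

    # pre-test risk of 800 per 1 million from the abstract; assumed Se = 0.92, Sp = 0.999
    p_post = p_infected_given_negative(800e-6, se=0.92, sp=0.999)
    print(f"{p_post * 1e6:.0f} per 1 million")   # roughly 64 per 1 million, the order quoted above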

5.
Abstract

Intense infections of the gill pathogen Dermocystidium salmonis were associated with mortality of prespawning chinook salmon Oncorhynchus tshawytscha in several Oregon rivers in 1988. The occurrence of the pathogen in returning adult chinook salmon was monitored in several coastal Oregon stocks from 1989 to 1993. Although the prevalence of the pathogen was high in these fish (up to 66.6%), infection intensities were generally low, and no mortality attributable to D. salmonis was observed. In 1988, the pathogen was associated with a lethal epizootic among juvenile chinook salmon smolts at the Trask State Fish Hatchery near Tillamook, Oregon. Histological examination of gills from heavily infected fish revealed hyperplasia of gill epithelium and fusion of gill lamellae. When naturally infected smolts were transferred from fresh to salt water, the most heavily infected fish died within 10 d, and the number of D. salmonis cysts declined and disappeared from previously infected salmon after 21–42 d.

6.
Long-term Salmonella Dublin carrier animals harbor the pathogen in lymph nodes and internal organs and can periodically shed bacteria through feces or milk, thereby contributing to transmission of the pathogen within infected herds. Thus, it is of great interest to reduce the number of new carrier animals in cattle herds. An observational field study was performed to evaluate factors affecting the risk that dairy cattle become carrier animals after infection with Salmonella Dublin. Based on repeated sampling, cattle in 12 Danish dairy herds were categorized according to course of infection, as either carriers (n = 157) or transiently infected (n = 87). The infection date for each animal was estimated from fecal excretion and antibody responses. The relationship between the course of infection (carrier versus transiently infected) and risk factors was analyzed using a random-effect multilevel, multivariable logistic regression model. The animals with the highest risk of becoming carriers were heifers infected between the age of 1 year and first calving, and cows infected around the time of calving. The risk was higher in the first two quarters of the year (late winter to spring), and when the prevalence of potential shedders in the herd was low. The risk also varied between herds. The herds with the highest risk of carrier development were herds with clinical disease outbreaks during the study period. These findings are useful for future control strategies against Salmonella Dublin, because they show the importance of optimized calving management and management of heifers, and because they show that even when the herd prevalence is low, carriers are still being produced. The results raise new questions about the development of the carrier state in cattle after infection with low doses of Salmonella Dublin.

7.
Tick-borne diseases are of increasing concern in many countries, particularly as a consequence of changes in land use and climate. Ticks are vectors of numerous pathogens (viruses, bacteria, protozoa) that can be harmful to humans and animals. In the context of animal health, bovine babesiosis poses a recurrent threat to cattle herds. In this study, we use a modeling approach to investigate the spread of babesiosis and evaluate control measures. A previously developed tick population dynamics model (here, Ixodes ricinus) is coupled with a pathogen spread model (here, the protozoan Babesia divergens), which describes pathogen spread in a dairy herd through the following processes: transmission, acquisition, transovarial transmission, transstadial persistence, and clearance of the pathogen. An assessment of the simulated B. divergens prevalence levels in ticks and cattle in the context of existing knowledge and data suggested that the model provides a realistic representation of pathogen spread. The model was then used to evaluate the influence of host density and the effect of acaricides on B. divergens prevalence in cattle. Increasing deer density results in an increase in prevalence in cattle, whereas increasing cattle stocking rate results in a slight decrease. A potential increase in deer density would thus have an amplification effect on disease spread due to the increase in the number of infected ticks. Regular use of acaricides produces a reduction in pathogen prevalence in cattle. This model could be adapted to other tick-borne diseases.

8.
Balanced autosomal translocations are a known cause of repeated early embryonic loss (REEL) in horses. In most cases, carriers of such translocations are phenotypically normal, but the chromosomal aberration negatively affects gametogenesis, giving rise to both genetically balanced and unbalanced gametes. The latter, if involved in fertilization, result in REEL, whereas gametes with the balanced form of the translocation will pass the defect to the next generation. Therefore, in order to reduce the incidence of REEL, identification of translocation carriers is critical. Here, we report on a phenotypically normal 3-year-old Arabian mare that had repeated resorption of conceptuses prior to day 45 of gestation and was diagnosed with REEL. Conventional and molecular cytogenetic analyses revealed that the mare had a normal chromosome number (64,XX) but carried a non-mosaic and non-reciprocal autosomal translocation t(4;10)(q21;p15). This is a novel translocation described in horses with REEL and the first such report in Arabians. Previous cases of REEL due to autosomal translocations have exclusively involved Thoroughbreds. The findings underscore the importance of routine cytogenetic screening of breeding animals.

9.
Chronic wasting disease (CWD) is a fatal disease of North American cervids that was first detected in a wild, hunter-shot deer in Saskatchewan along the border with Alberta in Canada in 2000. Spatially explicit models for assessing factors affecting disease detection are needed to guide surveillance and control programs. Spatio-temporal patterns in CWD prevalence can be complicated by variation in individual infection probability and sampling biases. We assessed hunter harvest data of mule deer (Odocoileus hemionus) and white-tailed deer (Odocoileus virginianus) during the early phases of an outbreak in Saskatchewan (i.e., 2002-2007) for targeting the detection of CWD by defining (1) where to look, and (2) how much effort to use. First, we accounted for known demographic heterogeneities in infection to model the probability, P(E), that a harvested deer was infected with CWD given characteristics of the harvest location. Second, in areas where infected deer were harvested we modelled the probability, P(D), of the hunter harvest re-detecting CWD within sample units of varying size (9-54 km²) given the demographics of harvested deer and time since first detection in the study area. Heterogeneities in host infection were consistent with those reported elsewhere: infection probability was 3.7 times higher in mule deer than in white-tailed deer, 1.8 times higher in males than in females, and quadratically related to age in both sexes. P(E) increased with the number of years since the first detection in our study area (2002) and proximity to known disease sources, and also varied with distance to the South Saskatchewan River and small creek drainages, terrain ruggedness, and the extent of agricultural lands within a 3 km radius of the harvest. The majority (75%) of new CWD-positive deer from our sample were found within 20 km of infected deer harvested in the previous year, while approximately 10% were found more than 40 km away. P(D) modelled at 18 km² was best supported, but at all scales, P(D) depended on the number of harvested deer and the time since the first infected deer was harvested. Within an 18 km² sampling unit, there was an 80% probability of detecting a CWD-positive deer with 16 harvested deer five years after the initial infected harvest. Identifying where and how much to sample to detect CWD can improve targeted surveillance programs early in the outbreak of the disease when based on hunter harvest.
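The "how much effort" question reduces to a simple detection-probability calculation once a per-deer probability is fixed. The sketch below is illustrative only: q is a hypothetical probability that a single harvested deer in the unit is CWD-positive and detected, not a value estimated in the study.

    def unit_detection_prob(q, n_harvested):
        """P(detect CWD in a sampling unit) if each harvested deer is independently
        positive-and-detected with probability q."""
        return 1 - (1 - q) ** n_harvested

    # with q = 0.095 (hypothetical), 16 harvested deer give roughly the 80% quoted above
    print(f"{unit_detection_prob(0.095, 16):.2f}")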

10.
Testing of pooled samples has been proposed as a low-cost alternative for diagnostic screening and surveillance for infectious agents in situations where the prevalence of infection is low and most samples can be expected to test negative. The present study extends our previous work in pooled-sample testing (PST) to evaluate effects of the following factors on the overall PST sensitivity (SEk) and specificity (SPk): dilution (pool size), cross-contamination, and cross-reaction. A probabilistic model, in conjunction with Monte Carlo simulations, was used to calculate SEk and SPk, as applied to detection of bovine viral diarrhea virus (BVDV) persistently infected (PI) animals using RT-PCR. For an average prevalence of BVDV PI of 0.01 and viremia in each animal between 10² and 10⁷ virus particles/mL, the pool size associated with the lowest number of tests, and lowest cost, corresponded to eight samples/pool. However, the least-cost pool size (lowest number of tests) was associated with a SEk of 0.90 (0.75–1), which corresponded to a decrease of 0.04, relative to the assay sensitivity for a single sample. The SPk for the same pool size, considering the effect of detection of BVDV acutely infected animals and cross-contamination as source of false positive results, was 0.90 (0.85–0.95). The effect of a hypothetical cross-reacting agent was to markedly decrease SPk, especially as the prevalence of the cross-reacting agent increased. For a pool size of eight samples and a prevalence of the cross-reacting agent of 0.3, SPk ranged from 0.67 to 0.86, depending on the probability that the assay would detect the cross-reacting agent. The methods presented offer a means of evaluating and understanding the various factors that can influence overall accuracy of PST procedures.
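The dilution effect at the heart of SEk can be sketched directly: a single PI sample with a log-normally distributed titre is blended into a pool and detected only if the diluted titre stays above the assay's limit. The titre distribution, pool size and detection limit below are assumptions, not the study's inputs, and cross-contamination and cross-reaction are ignored.

    import numpy as np

    rng = np.random.default_rng(2)

    pool_size = 8                   # samples per pool (the least-cost size reported above)
    log10_titre = (4.5, 1.5)        # assumed mean, sd of log10 virus particles/mL in a PI sample
    log10_limit = 2.0               # assumed RT-PCR detection limit (log10 particles/mL)

    def pool_sensitivity(n_iter=50000):
        """P(pool tests positive | pool contains exactly one PI sample), dilution only."""
        titres = 10 ** rng.normal(*log10_titre, n_iter)
        diluted = titres / pool_size                  # one positive blended with negatives
        return np.mean(np.log10(diluted) >= log10_limit)

    print(f"pool-level sensitivity for pools of {pool_size}: {pool_sensitivity():.2f}")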

11.
We developed the BSurvE spreadsheet model to estimate the true prevalence of bovine spongiform encephalopathy (BSE) in a national cattle population, and evaluate national BSE surveillance programs. BSurvE uses BSE surveillance data and demographic information about the national cattle population. The proportion of each cohort infected with BSE is found by equating the observed number of infected animals with the number expected, following a series of probability calculations and assuming a binomial distribution for the number of infected animals detected in each surveillance stream. BSurvE has been used in a series of international workshops, where analysis of national datasets demonstrated patterns of cohort infection that were consistent with infection-control activities within the country. The results also reflected the timing of known events that were high-risk for introduction of the infectious agent.
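The core of the approach can be illustrated with a toy calculation: choose the cohort's infected proportion so that the expected number of detections across surveillance streams matches the number observed. Stream sizes, stream-specific detection probabilities and the observed count below are invented for illustration; BSurvE's actual probability calculations are considerably more detailed.

    # (examined animals, P(an infected animal in this stream is detected)) -- invented values
    streams = {
        "fallen stock": (20_000, 0.60),
        "casualty slaughter": (5_000, 0.45),
        "healthy slaughter": (200_000, 0.10),
    }
    observed_positives = 12

    # Expected detections are linear in the infected proportion pi:
    #   E[detections] = sum over streams of n_examined * pi * p_detect
    # so equating expected with observed gives pi directly.
    pi_hat = observed_positives / sum(n * p for n, p in streams.values())
    print(f"estimated infected proportion of the cohort: {pi_hat:.2e}")   # ~3.5e-04 here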

12.
The national control programme for Salmonella in Danish swine herds introduced in 1993 has led to a large decrease in pork-associated human cases of salmonellosis. The pork industry is increasingly focused on the cost-effectiveness of surveillance while maintaining consumer confidence in the pork food supply. Using national control programme data from 2003 and 2004, we developed a zero-inflated binomial model to predict which farms were most at risk of Salmonella. We preferentially sampled these high-risk farms using two sampling schemes based on model predictions resulting from a farm's covariate pattern and its random effect. Zero-inflated binomial modelling allows assessment of similarities and differences between factors that affect herd infection status (introduction) and those that affect the seroprevalence in infected herds (persistence and spread). Both large herds (producing more than 5000 pigs per annum) and small herds (producing fewer than 2000 pigs per annum) were at significantly higher risk for infection and subsequent seroprevalence, when compared with medium-sized herds (producing between 2000 and 5000 pigs per annum). Compared with herds located elsewhere, location in the south of Jutland significantly decreased the risk of herd infection, but increased the risk of a pig from an infected herd being seropositive. The model suggested that many of the herds where Salmonella was not detected were infected, but at a low prevalence. Using cost and sensitivity, we compared the results of our model-based sampling schemes with those under the standard sampling scheme, based on herd size, and the recently introduced risk-based approach. Model-based results were less sensitive but showed significant cost savings. Further model refinements, sampling schemes and the methods to evaluate their performance are important areas for future work, and these should continue to occur in direct consultation with Danish authorities.
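The zero-inflated binomial idea can be written down compactly: a herd is uninfected with probability pi (and then yields no seropositives by definition), otherwise its positive count is binomial. In the study both pi and the binomial probability are modelled on covariates such as herd size and region; in the toy log-likelihood below they are plain numbers.

    import math

    def zib_loglik(y, n, pi, p):
        """Zero-inflated binomial log-likelihood for one herd:
        y seropositive samples out of n; pi = P(herd uninfected), p = within-herd seroprevalence."""
        binom = math.comb(n, y) * p**y * (1 - p) ** (n - y)
        if y == 0:
            return math.log(pi + (1 - pi) * binom)   # a zero can come from either component
        return math.log((1 - pi) * binom)            # any positive rules out the 'uninfected' component

    print(zib_loglik(0, 10, pi=0.4, p=0.05))   # zeros are relatively likely
    print(zib_loglik(3, 10, pi=0.4, p=0.05))   # positives force the infected component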

13.
Heifers (n=136) from 5 herds were treated with a commercially available beta-lactam intramammary (IMM) antibiotic preparation containing cephapirin sodium at 10-21 d prior to anticipated parturition to evaluate the risk of antibiotic residues occurring in milk postpartum and to determine factors associated with antibiotic residues and IMM pathogen presence in milk postpartum. Mammary secretions collected from quarters before antibiotic administration and during weeks 1, 2 and 3 postpartum were analyzed for mastitis pathogens. Composite milk was collected at milkings 3, 6 and 10 postpartum and analyzed for beta-lactam residues using a microbial inhibition antibiotic residue screening test. Antibiotic residues were confirmed with beta-lactamase treatment and re-testing for residues. Residues were detected in 28.0%, 8.82% and 3.68% of milk samples obtained at the third, sixth, and tenth milking postpartum, respectively. Increases in the interval between prepartum antibiotic therapy and parturition, and in the postpartum interval to sampling, were associated with a decreased risk of antibiotic residues. The presence of antibiotic residues in milk at the third milking was associated with a reduced risk for IMM pathogen prevalence in the first 21 d postpartum. Lower somatic cell counts, an increase in mean milk yield over 200 days in milk and a reduction in IMM pathogen prevalence were associated with the presence of an antibiotic in milk postpartum. Screening for antibiotic residues in milk postpartum following prepartum antibiotic therapy in heifers is recommended to reduce the risk of antibiotic residue contamination of milk.

14.

Background

In recent years, the occurrence and the relevance of Mycoplasma hyopneumoniae infections in suckling pigs have been examined in several studies. Whereas most of these studies focused solely on prevalence estimation within different age groups, follow-up of infected piglets, or assessment of pathological findings, none included a detailed analysis of individual and environmental risk factors. Therefore, the aim of the present study was to investigate the frequency of M. hyopneumoniae infections in suckling pigs of endemically infected herds and to identify individual risk factors potentially influencing the infection status of suckling pigs at the age of weaning.

Results

The animal-level prevalence of M. hyopneumoniae infections in suckling pigs examined in three conventional pig breeding herds was 3.6% (41/1127) at the time of weaning. A prevalence of 1.2% was found in the same pigs at the end of their nursery period. In a multivariable Poisson regression model, the incidence rate ratio (IRR) for suckling pigs was significantly lower than 1 when teeth grinding was conducted (IRR: 0.10). Moreover, high temperatures in the piglet nest during the first two weeks of life (occasionally >40°C) were associated with a decreased probability of infection (IRR: 0.23-0.40). In contrast, the application of PCV2 vaccines to piglets was associated with an increased infection risk (IRR: 9.72).

Conclusions

Since single infected piglets are thought to act as initiators for the transmission of this pathogen in nursery and fattening pigs, eliminating the risk factors described in this study should help to reduce the incidence rate of M. hyopneumoniae infections and thereby might contribute to a reduced probability of high prevalence in older pigs.

15.

Background

Mycoplasma hyopneumoniae is the etiologic agent of enzootic pneumonia, mainly occurring in fattening pigs. It is assumed that horizontal transmission of the pathogen during the nursery and growing phases starts with a few suckling pigs vertically infected by the sow. The aim of the present study was to explore the herd-level prevalence of M. hyopneumoniae infections in suckling pigs and to investigate various herd-specific factors for their potential to influence the occurrence of this pathogen at the age of weaning.

Results

In this cross-sectional study, 125 breeding herds were examined by taking nasal swabs from 20 suckling pigs in each herd. In total, 3.9% (98/2500) of all nasal swabs tested positive for M. hyopneumoniae by real-time PCR. Piglets testing positive originated from 46 different herds, resulting in an overall herd-level prevalence of 36.8% (46/125) for M. hyopneumoniae infection in pigs at the age of weaning. When the herds were characterized epidemiologically, the risk of detecting M. hyopneumoniae was significantly increased when more than 120 gilts were purchased per year (OR: 5.8) and when there were more than 16 farrowing pens per compartment (OR: 3.3). In herds with planned, segregated production, where groups of sows entered previously emptied farrowing units, the risk of detecting M. hyopneumoniae in piglets was higher in herds with two or four weeks between batches than in herds with one or three weeks between batches (OR: 2.7).

Conclusions

In this cross-sectional study, several risk factors were identified that increase the probability that breeding herds raise suckling pigs already infected with M. hyopneumoniae at the time of weaning. Interestingly, some factors (farrowing rhythm, gilt acclimatisation issues) overlapped with those influencing the seroprevalence among sows or the transmission of the pathogen between older age groups. Taking the multifactorial character of enzootic pneumonia into account, the results of this study substantiate that a comprehensive herd-specific prevention programme is a prerequisite to reduce transmission of, and disease caused by, M. hyopneumoniae.

16.
We used Monte Carlo simulation to estimate distributions for flock-level sensitivity of abattoir-based surveillance for ovine paratuberculosis as currently practised in New South Wales, Australia. Probability distributions were used as input variables for within-flock prevalence, years-infected and individual animal-level sensitivity and specificity of gross pathology as a screening test for the presence of paratuberculosis. Distributions used as inputs for the size of abattoir-slaughter groups were based on existing abattoir-surveillance data from NSW. Predicted flock-level sensitivity depended on within-flock prevalence and the number of animals examined and was sensitive to estimates of animal-level sensitivity and specificity. The median probability of detection of an infected flock based on the examination of one abattoir line was predicted not to exceed 0.95 unless the within-flock prevalence was ≥7%. If the within-flock prevalence was 2%, the probability distribution of flock-level sensitivity had a median of 0.73, with 80% of values lying between 0.55 and 0.84. Improvement in the flock-level sensitivity could be achieved by submitting more than three gross pathology-positive specimens per line, if available—but the degree of improvement depended on the number of sheep slaughtered (line size) and the within-flock prevalence. At 2% prevalence, a median flock-level sensitivity of 0.95 could be obtained in lines of >390 sheep if six gross pathology-positive specimens were submitted. We concluded that abattoir surveillance based on identification of gross pathology as a screening test is not a sensitive tool for detecting recently infected flocks or flocks which have a moderate or lower prevalence of infected animals. But—with relatively minor modifications of the protocol currently in use—it could become a key component of a surveillance programme which included additional testing strategies for small flocks.
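A back-of-envelope version of the flock-level sensitivity calculation: in an infected line of n sheep with within-flock prevalence p, each animal screens positive on gross pathology with probability p times se, and the flock is detected only if at least k pathology-positive specimens are found (laboratory confirmation assumed perfect). The se value and the k = 3 rule below are assumptions for illustration, which is why the result only roughly matches the 0.73 median quoted above.

    from math import comb

    def flock_sensitivity(p, se, n, k):
        """P(at least k screening-positive sheep in a line of n | within-flock prevalence p)."""
        q = p * se                                              # P(a given sheep screens positive)
        return 1 - sum(comb(n, i) * q**i * (1 - q) ** (n - i) for i in range(k))

    print(f"{flock_sensitivity(p=0.02, se=0.6, n=300, k=3):.2f}")   # ~0.70 with these assumptions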

17.
Abstract

We provide the computer program code to estimate pathogen prevalence and calculate confidence intervals for estimates based on maximum likelihood methods for the open-source statistical and graphics package R and a commercially licensed statistical package, the Statistical Analysis System (SAS). We correct a previously published SAS program to allow use of newer versions of the SAS software and provide a second SAS program that will work in either version of SAS. All of the programs allow users to calculate prevalence from any number of pooled samples representing different numbers of individuals, and two of these programs allow users to make estimates from data pools that are entirely test-positive or test-negative.
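The same maximum-likelihood estimate is easy to express in other languages; the sketch below does it in Python rather than R or SAS, assumes a perfect test on each pool, and uses a simple grid search so it also handles pools of unequal size.

    import math

    def log_lik(p, pools):
        """Log-likelihood of individual-level prevalence p given pooled results.
        pools: iterable of (pool_size, tested_positive); the pool test is assumed perfect."""
        ll = 0.0
        for k, positive in pools:
            p_pool_negative = (1 - p) ** k
            ll += math.log(1 - p_pool_negative) if positive else math.log(p_pool_negative)
        return ll

    def mle_prevalence(pools, grid=10_000):
        # endpoints excluded: the likelihood degenerates at p = 0 and p = 1
        return max((log_lik(i / grid, pools), i / grid) for i in range(1, grid))[1]

    pools = [(5, True)] * 4 + [(5, False)] * 16      # 20 pools of 5, of which 4 tested positive
    print(f"{mle_prevalence(pools):.3f}")            # ~0.044, i.e. 1 - (16/20)**(1/5)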

18.
ABSTRACT: The control of highly infectious diseases of livestock such as classical swine fever, foot-and-mouth disease, and avian influenza is fraught with ethical, economic, and public health dilemmas. Attempts to control outbreaks of these pathogens rely on massive culling of infected farms and of farms deemed to be at risk of infection. Conventional approaches usually involve the preventive culling of all farms within a certain radius of an infected farm. Here we propose a novel culling strategy based on the idea that farms with the highest expected number of secondary infections should be culled first. We show that, in comparison with conventional approaches (ring culling), our new method of risk-based culling can reduce the total number of farms that need to be culled, the number of culled infected farms (and thus the expected number of human infections in case of a zoonosis), and the duration of the epidemic. Our novel risk-based culling strategy requires three pieces of information, viz. the location of all farms in the area at risk, the moments when infected farms are detected, and an estimate of the distance-dependent probability of transmission.
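A minimal sketch of the ranking step, under assumptions: each farm is scored by the sum of a distance-dependent transmission probability over all other farms (the kernel below is hypothetical), and farms are culled in descending order of that score. The actual strategy described above would also condition on which farms have already been detected as infected.

    import math

    def kernel(d_km, p0=0.2, d0=2.0):
        """Hypothetical distance-dependent probability of transmission between two farms."""
        return p0 / (1 + (d_km / d0) ** 2)

    def rank_for_culling(farms):
        """farms: dict name -> (x, y) in km; returns names ordered by expected secondary infections."""
        def expected_secondary(a):
            xa, ya = farms[a]
            return sum(kernel(math.hypot(xa - x, ya - y))
                       for b, (x, y) in farms.items() if b != a)
        return sorted(farms, key=expected_secondary, reverse=True)

    farms = {"A": (0.0, 0.0), "B": (1.0, 0.0), "C": (1.5, 0.5), "D": (10.0, 10.0)}
    print(rank_for_culling(farms))   # farms in the dense cluster outrank the isolated farm D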

19.
Pathogens such as Escherichia coli O157:H7 and Campylobacter spp. have been implicated in outbreaks of food poisoning in the UK and elsewhere. Domestic animals and wildlife are important reservoirs for both of these agents, and cross-contamination from faeces is believed to be responsible for many human outbreaks. Appropriate parameterisation of quantitative microbial-risk models requires representative data at all levels of the food chain. Our focus in this paper is on the early stages of the food chain: specifically, sampling issues that arise at the farm level. We estimated animal-level pathogen prevalence from faecal-pat samples using a Bayesian method that reflected the uncertainties inherent in the animal-level prevalence estimates. (Note that prevalence here refers to the percentage of animals shedding the bacteria of interest.) The method offers more flexibility than traditional, classical approaches: it allows the incorporation of prior belief, and permits the computation of a variety of distributional and numerical summaries, analogues of which often are not available through a classical framework. The Bayesian technique is illustrated with a number of examples reflecting the effects of a diversity of assumptions about the underlying processes. The technique appears to be both robust and flexible, and is useful when defecation rates in infected and uninfected groups are unequal, where population size is uncertain, and also where the microbiological-test sensitivity is imperfect. We also investigated the sample size necessary to determine animal-level prevalence from pat samples to within a pre-specified degree of accuracy.
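A minimal grid-based version of such a Bayesian prevalence estimate, assuming a flat Beta(1, 1) prior, a known test sensitivity, perfect specificity, and ignoring the pat-versus-animal sampling complications (unequal defecation rates, uncertain population size) that the paper handles explicitly.

    import numpy as np

    n_pats, n_positive = 120, 9            # assumed sample: 120 faecal pats, 9 test-positive
    se, sp = 0.75, 1.0                     # assumed test sensitivity and specificity

    p_grid = np.linspace(0.001, 0.999, 999)              # candidate animal-level prevalences
    p_test_pos = p_grid * se + (1 - p_grid) * (1 - sp)   # P(a sampled pat tests positive)
    log_post = n_positive * np.log(p_test_pos) + (n_pats - n_positive) * np.log(1 - p_test_pos)
    post = np.exp(log_post - log_post.max())
    post /= post.sum()                                   # flat prior, so posterior proportional to likelihood

    mean = float(np.sum(p_grid * post))
    lo, hi = p_grid[np.searchsorted(np.cumsum(post), [0.025, 0.975])]
    print(f"posterior mean {mean:.3f}, 95% credible interval {lo:.3f}-{hi:.3f}")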

20.
Epizootic haematopoietic necrosis virus (EHNV) is an iridovirus that affects perch (Perca fluviatilis) and rainbow trout (Oncorhynchus mykiss). It emerged in Australia in the 1980s and has not been discovered elsewhere. It causes a high level of mortality in perch, resulting in steep population declines. The main possible routes of introduction of the virus to England and Wales are the importation of infected live fish or carcasses. However, no trade in live susceptible species is permitted under current legislation, and no importation of carcasses currently takes place. The virus is hardy, and low levels of challenge can infect perch. Therefore, mechanical transmission through the importation of non-susceptible fish species should be considered as a potential route of introduction and establishment. Carp (Cyprinus carpio) have been imported to the UK from Australia for release into still-water fisheries. A qualitative risk assessment concluded that the likelihood of EHNV introduction and establishment in England and Wales with the importation of a consignment of carp was very low. The level of uncertainty at a number of steps in the risk assessment scenario tree was high, notably the likelihood that carp become contaminated with the virus and whether effective contact (resulting in pathogen transmission) is made between the introduced carp and susceptible species in England and Wales. The virus would only become established when the water temperature is greater than 12 °C. Analysis of 10 years of data from two rivers in south-west England indicated that establishment could occur over a period of at least 14 weeks a year in southern England (when average water temperatures exceed 12 °C). Imports of live fish from Australia need to be evaluated on a case-by-case basis to determine which, if any, sanitary measures are required to reduce the assessed risk to an acceptable level.
