A pinboard by
Nadine Upton

Dr Nadine Upton, Post-doctoral Life Scientist | Curation made possible by Deep Science Ventures


Follow this board to stay on top of trending and game-changing research papers for drug discovery

Expert curation of this pinboard is made possible by Deep Science Ventures, a partner of Sparrho.

The Deep Science Ventures Community

It's about creating the connections that allow you to turn your vision into reality in the fastest way possible.

Deep Science Ventures is focused on bridging the gaps between industry and academia, different disciplines, and founders and investors, by creating collisions in the least awkward and most effective way possible.

If you are excited about the application of science and want to meet like-minded people in your sector or technology area, you are welcome to join our small dinners, larger events and casual drinks. Leave your email on our website and we'll keep you up to date (around one email / month).


Validation of Metagenomic Next-Generation Sequencing Tests for Universal Pathogen Detection.

Abstract: Context: Metagenomic sequencing can be used to detect any pathogen using unbiased, shotgun next-generation sequencing (NGS), without the need for sequence-specific amplification. Proof of concept has been demonstrated in infectious disease outbreaks of unknown cause and in patients with suspected infections but negative results on conventional tests. Metagenomic NGS tests hold great promise for improving infectious disease diagnostics, especially in immunocompromised and critically ill patients. Objective: To discuss challenges and provide example solutions for validating metagenomic pathogen detection tests in clinical laboratories. A summary of current regulatory requirements, largely based on prior guidance for NGS testing in constitutional genetics and oncology, is provided. Data sources: Examples from two separate validation studies are provided for steps from assay design and validation of wet-bench and bioinformatics protocols to quality control and assurance. Conclusions: Although laboratory and data analysis workflows are still complex, metagenomic NGS tests for infectious diseases are increasingly being validated in clinical laboratories. Many parallels exist with NGS tests in other fields. Nevertheless, specimen preparation, rapidly evolving data analysis algorithms, and incomplete reference sequence databases are idiosyncratic to the field of microbiology and often overlooked.
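
A core step in the bioinformatics validation described above is normalising organism read counts and applying a detection threshold against negative controls. The Python sketch below illustrates that idea only; the read counts, the reads-per-million normalisation and the 10x ratio cut-off are hypothetical placeholders, not the paper's validated criteria.

```python
# Illustrative only: reads-per-million (RPM) normalisation and thresholding of the
# kind used when calling pathogens from metagenomic NGS data. All numbers and the
# 10x RPM-ratio cut-off are hypothetical placeholders, not values from the paper.

def rpm(organism_reads: int, total_reads: int) -> float:
    """Reads per million sequenced reads."""
    return organism_reads / total_reads * 1e6

def call_pathogen(sample_rpm: float, control_rpm: float, ratio_cutoff: float = 10.0) -> bool:
    """Flag an organism when its RPM exceeds the no-template control by a set ratio."""
    # Guard against division by zero when the control has no reads at all.
    return sample_rpm >= ratio_cutoff * max(control_rpm, 0.1)

sample = rpm(organism_reads=420, total_reads=12_000_000)   # hypothetical patient sample
control = rpm(organism_reads=3, total_reads=9_500_000)     # hypothetical negative control
print(f"sample RPM={sample:.1f}, control RPM={control:.1f}, detected={call_pathogen(sample, control)}")
```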

Pub.: 09 Feb '17, Pinned: 12 Feb '17

Detecting free-living steps and walking bouts: validating an algorithm for macro gait analysis.

Abstract: Research suggests wearables, and not instrumented walkways, are better suited to quantify gait outcomes in clinic and free-living environments, providing a more comprehensive overview of walking due to continuous monitoring. Numerous validation studies in controlled settings exist, but few have examined the validity of wearables and associated algorithms for identifying and quantifying step counts and walking bouts in uncontrolled (free-living) environments. Studies which have examined free-living step and bout count validity found limited agreement due to variations in walking speed, changing terrain or task. Here we present a gait segmentation algorithm to define free-living step count and walking bouts from an open-source, high-resolution, accelerometer-based wearable (AX3, Axivity). Ten healthy participants (20-33 years) wore two portable gait measurement systems: a wearable accelerometer on the lower back and a wearable body-mounted camera (GoPro HERO) on the chest, for 1 h on two separate occasions (24 h apart) during free-living activities. Step count and walking bouts were derived for both measurement systems and compared. Across a total of almost 20 h of uncontrolled and unscripted free-living activity data from all participants, excellent relative (rho ⩾ 0.941) and absolute (ICC(2,1) ⩾ 0.975) agreement with no evidence of bias was identified for step count compared with the camera (gold-standard reference). Walking bout identification showed excellent relative (rho ⩾ 0.909) and absolute agreement (ICC(2,1) ⩾ 0.941) but demonstrated significant bias. The algorithm employed for identifying and quantifying steps and bouts from a single wearable accelerometer worn on the lower back was demonstrated to be valid and could be used for pragmatic gait analysis in prolonged, uncontrolled free-living environments.
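
For readers curious what a macro gait algorithm of this kind involves, here is a minimal, hypothetical Python sketch of peak-based step detection and bout grouping on a toy lower-back acceleration signal. The sampling rate, thresholds and bout gap are assumptions for illustration, not the validated parameters from the paper.

```python
# A minimal sketch of accelerometer-based step detection and bout grouping,
# assuming a tri-axial signal sampled at 100 Hz. This is NOT the paper's
# validated algorithm; the toy signal and thresholds are illustrative.
import numpy as np
from scipy.signal import find_peaks

fs = 100  # sampling rate in Hz (assumed)
rng = np.random.default_rng(0)
t = np.arange(0, 60, 1 / fs)
# Toy lower-back acceleration (in g): gravity plus a ~2 Hz walking component and noise.
acc = np.stack([0.3 * np.sin(2 * np.pi * 2 * t),
                rng.normal(0, 0.1, t.size),
                1.0 + 0.4 * np.sin(2 * np.pi * 2 * t)], axis=1)

magnitude = np.linalg.norm(acc, axis=1)                                 # orientation-free vector magnitude
peaks, _ = find_peaks(magnitude, height=1.1, distance=int(0.35 * fs))   # candidate steps

# Group steps into walking bouts when consecutive steps are < 2.5 s apart.
gaps = np.diff(peaks) / fs
bouts = (1 + int(np.sum(gaps > 2.5))) if peaks.size else 0
print(f"steps detected: {peaks.size}, walking bouts: {bouts}")
```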

Pub.: 13 Dec '16, Pinned: 03 Feb '17

An immunogram for the cancer-immunity cycle: towards personalized immunotherapy of lung cancer.

Abstract: The interaction of immune cells and cancer cells shapes the immunosuppressive tumor microenvironment. For successful cancer immunotherapy, comprehensive knowledge of anti-tumor immunity as a dynamic spatio-temporal process is required for each individual patient. To this end, we developed an immunogram for the cancer-immunity cycle using next-generation sequencing. Whole-exome sequencing and RNA-Seq were performed in 20 non-small cell lung cancer patients (12 adenocarcinoma, 7 squamous cell carcinoma, and 1 large cell neuroendocrine carcinoma). Mutated neoantigens and cancer-germline antigens expressed in the tumor were assessed for predicted binding to patients' HLA molecules. The expression of genes related to cancer immunity was assessed and normalized to construct a radar chart composed of 8 axes reflecting 7 steps in the cancer-immunity cycle. Three immunogram patterns were observed in lung cancer patients: T-cell-rich, T-cell-poor and intermediate. The T-cell-rich pattern was characterized by gene signatures of abundant T cells, Tregs and MDSCs, checkpoint molecules and immune-inhibitory molecules in the tumor, suggesting the presence of anti-tumor immunity dampened by an immunosuppressive microenvironment. The T-cell-poor phenotype reflected a lack of anti-tumor immunity, inadequate DC activation, and insufficient antigen presentation in the tumor. Immunograms for both the adenocarcinoma patients and the non-adenocarcinoma patients included both T-cell-rich and T-cell-poor phenotypes, suggesting that histology does not necessarily reflect the cancer-immunity status of the tumor. The patient-specific landscape of the tumor microenvironment can be appreciated using the immunogram as an integrated biomarker, which may thus become a valuable resource for optimal personalized immunotherapy.
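
The immunogram itself is essentially a radar chart of normalised axis scores. The sketch below shows how such a chart can be drawn in Python; the eight axis labels and the scores are hypothetical placeholders rather than the paper's actual gene-signature scoring.

```python
# A minimal sketch of an 8-axis "immunogram" radar chart built from normalised
# scores. Axis names and values are hypothetical placeholders; the paper's
# scoring of each axis is not reproduced here.
import numpy as np
import matplotlib.pyplot as plt

axes = ["T cells", "Tumor antigenicity", "Priming/activation", "Trafficking",
        "Infiltration", "Recognition", "Inhibitory cells", "Checkpoint expression"]
scores = [3.2, 2.5, 2.8, 3.0, 3.5, 2.9, 1.8, 2.2]        # hypothetical normalised scores (e.g. 0-4)

angles = np.linspace(0, 2 * np.pi, len(axes), endpoint=False).tolist()
scores_closed = scores + scores[:1]                       # repeat first point to close the polygon
angles_closed = angles + angles[:1]

fig, ax = plt.subplots(subplot_kw={"projection": "polar"})
ax.plot(angles_closed, scores_closed, linewidth=2)
ax.fill(angles_closed, scores_closed, alpha=0.25)
ax.set_xticks(angles)
ax.set_xticklabels(axes, fontsize=8)
ax.set_title("Immunogram (illustrative scores)")
plt.tight_layout()
plt.show()
```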

Pub.: 16 Jan '17, Pinned: 01 Feb '17

Effectiveness, safety and costs of orphan drugs: an evidence-based review.

Abstract: Several orphan drugs have been approved by the European Medicines Agency (EMA) over the past two decades. However, these drugs are expensive, and in some instances the evidence for effectiveness is not convincing at the time of regulatory approval. Our objective was to evaluate the clinical effectiveness of orphan drugs that have been granted marketing licenses in Europe, determine the annual costs of each drug, compare the costs of branded orphan drugs against their generic equivalents, and explore any relationship between orphan drug disease prevalence and annual costs. We searched the EMA database to identify orphan drugs granted marketing authorisation up to April 2014. Electronic searches were also conducted in PubMed, EMBASE and Google Scholar to assess data on effectiveness, safety and annual costs. Two reviewers independently evaluated the levels and quality of evidence and extracted data. We identified 74 orphan drugs, with 54 (73%) demonstrating moderate quality of evidence. 85% showed significant clinical effects, but serious adverse events were reported for 86.5%. Their annual costs were between £726 and £378,000. There was a significant inverse relationship between disease prevalence and annual costs (p = 0.01); this was largely due to the influence of the ultra-orphan diseases. We could not determine whether the balance between effectiveness and safety influenced annual costs. For the 10 drugs where generic alternatives were available, the branded drugs were 1.4 to 82,000 times more expensive. The available evidence suggests that there is inconsistency in the quality of evidence of approved orphan drugs, and there is no clear mechanism for determining their prices. In some cases, far cheaper generic agents appear to be available. A more robust, transparent and standard mechanism for determining annual costs is imperative.
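
The reported inverse prevalence-cost relationship is the kind of result a rank correlation captures. Below is a minimal Python sketch on simulated, not the paper's, prevalence and cost figures.

```python
# A minimal sketch of testing an inverse prevalence-cost relationship with a
# Spearman rank correlation. The per-drug prevalence and cost values are
# simulated placeholders, not data from the review.
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(0)
prevalence_per_10k = rng.uniform(0.01, 5.0, size=74)                    # hypothetical disease prevalence
annual_cost_gbp = 50_000 / (prevalence_per_10k + 0.05) * rng.lognormal(0, 0.5, size=74)

rho, p_value = spearmanr(prevalence_per_10k, annual_cost_gbp)
print(f"Spearman rho = {rho:.2f}, p = {p_value:.3g}")                   # expect a negative rho
```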

Pub.: 26 Jun '15, Pinned: 27 Jan '17

Access to Orphan Drugs: A Comprehensive Review of Legislations, Regulations and Policies in 35 Countries.

Abstract: To review existing regulations and policies utilised by countries to enable patient access to orphan drugs. A review of the literature (1998 to 2014) was performed to identify relevant, peer-reviewed articles. Using content analysis, we synthesised regulations and policies for access to orphan drugs by type and by country. Fifty-seven articles and 35 countries were included in this review. Six broad categories of regulation and policy instruments were identified: national orphan drug policies, orphan drug designation, marketing authorization, incentives, marketing exclusivity, and pricing and reimbursement. The availability of orphan drugs depends on each country's legislation and regulations, including national orphan drug policies, orphan drug designation, marketing authorization, marketing exclusivity and incentives such as tax credits to ensure research, development and marketing. The majority of countries (27/35) had orphan drug legislation in place. Access to orphan drugs depends on each country's pricing and reimbursement policies, which varied widely between countries. High prices and insufficient evidence often prevent orphan drugs from meeting the traditional health technology assessment criteria, especially cost-effectiveness, which may influence access. Overall, many countries have implemented a combination of legislation, regulations and policies for orphan drugs in the last two decades. While these may enable the availability of and access to orphan drugs, there are critical differences between countries in the range and types of legislation, regulations and policies implemented. Importantly, China and India, two of the largest countries by population size, both lack national legislation for orphan medicines and rare diseases, which could have substantial negative impacts on their patient populations with rare diseases.

Pub.: 10 Oct '15, Pinned: 27 Jan '17

COLD-PCR Technologies in the Area of Personalized Medicine: Methodology and Applications.

Abstract: Somatic mutations hold great promise as biomarkers for personalized medicine, but are often present only in low abundance in biological material and are therefore difficult to detect. Many assays for mutation analysis in cancer-related genes (hotspots) have been developed to improve diagnosis, prognosis, prediction of drug resistance, and monitoring of the response to treatment. Two major approaches have been developed: mutation-specific amplification methods and methods that enrich and detect mutations without prior knowledge of the exact location and identity of the mutation. CO-amplification at Lower Denaturation temperature Polymerase Chain Reaction (COLD-PCR) methods such as full-, fast-, ice- (improved and complete enrichment), enhanced-ice- and temperature-tolerant COLD-PCR make use of a critical temperature in the polymerase chain reaction to selectively denature wild-type-mutant heteroduplexes, allowing the enrichment of rare mutations. Mutations can subsequently be identified using a variety of laboratory technologies such as high-resolution melting, digital polymerase chain reaction, pyrosequencing, Sanger sequencing, or next-generation sequencing. COLD-PCR methods are sensitive, specific, and accurate if appropriately optimized, and have a short time to results. A large variety of clinical samples (tumor DNA, circulating cell-free DNA, circulating cell-free fetal DNA, and circulating tumor cells) have been studied using COLD-PCR in many different applications, including the detection of genetic changes in cancer and infectious diseases, non-invasive prenatal diagnosis, detection of microorganisms, and DNA methylation analysis. In this review, we describe in detail the different COLD-PCR approaches, highlighting their specific features, advantages, and drawbacks, and demonstrating their use in different fields of biological and biomedical research.
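
The headline quantity for any COLD-PCR protocol is the fold enrichment of the mutant allele achieved by amplifying at the critical denaturation temperature. A minimal Python sketch of that arithmetic, with hypothetical allele fractions:

```python
# A minimal sketch of the fold-enrichment arithmetic usually reported for
# COLD-PCR: the mutant-allele fraction after vs. before the
# critical-temperature amplification. The fractions below are hypothetical.
def fold_enrichment(maf_before: float, maf_after: float) -> float:
    """Ratio of mutant-allele fractions after vs. before enrichment."""
    if not (0 < maf_before <= 1 and 0 < maf_after <= 1):
        raise ValueError("allele fractions must lie in (0, 1]")
    return maf_after / maf_before

# e.g. a 0.5% mutant allele enriched to 15% corresponds to 30-fold enrichment
print(f"{fold_enrichment(0.005, 0.15):.0f}-fold enrichment")
```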

Pub.: 20 Jan '17, Pinned: 27 Jan '17

Exploring gut microbes in human health and disease: Pushing the envelope.

Abstract: Humans have coevolved with their microbes over thousands of years, but this relationship is now being dramatically affected by shifts in the collective human microbiome resulting from changes in the environment and societal norms. The resulting perturbations of intestinal host-microbe interactions can lead to miscues and altered host responses that increase the risk of pathogenic processes and promote "western" disorders such as inflammatory bowel diseases, cancers, obesity, diabetes, autism, and asthma. Given the current challenges and limitations in gene therapy, approaches that can reshape the gut microbiome represent a reasonable strategy for restoring the balance between host and microbes. In this review and commentary, we highlight recent progress in our understanding of the intestinal microbiome in the context of health and disease, focusing on mechanistic concepts that underlie the complex relationships between host and microbes. Despite these gains, many challenges lie ahead that make it difficult to close the gap between the basic sciences and clinical application. We discuss the potential therapeutic strategies that can be used to manipulate the gut microbiota, recognizing that the promise of pharmabiotics ("bugs to drugs") is unlikely to be completely fulfilled without a greater understanding of the enteric microbiota and its impact on mammalian physiology. By leveraging the knowledge gained through these studies, we will be prepared to enter the era of personalized medicine, where clinical interventions can be custom-tailored to individual patients to achieve better outcomes.

Pub.: 03 Feb '15, Pinned: 26 Jan '17

Mapping chemical structure-activity information of HAART-drug cocktails over complex networks of AIDS epidemiology and socioeconomic data of U.S. counties.

Abstract: Using computational algorithms to design tailored drug cocktails for highly active antiretroviral therapy (HAART) in specific populations is a goal of major importance for both the pharmaceutical industry and public health policy institutions. New combinations of compounds need to be predicted in order to design HAART cocktails. On the one hand, there are the biomolecular factors related to the drugs in the cocktail (experimental measure, chemical structure, drug target, assay organisms, etc.); on the other hand, there are the socioeconomic factors of the specific population (income inequalities, employment levels, fiscal pressure, education, migration, population structure, etc.) needed to study the relationship between socioeconomic status and the disease. In this context, machine learning algorithms capable of building models from multi-source data have to be used. In this work, the first artificial neural network (ANN) model is proposed for the prediction of HAART cocktails to halt AIDS on epidemic networks of U.S. counties, using information indices that codify both biomolecular and socioeconomic factors. The data were obtained from three major sources. The first dataset included assays of anti-HIV chemical compounds released to ChEMBL. The second dataset is the AIDSVu database of Emory University; AIDSVu compiled AIDS prevalence for >2300 U.S. counties. The third dataset included socioeconomic data from the U.S. Census Bureau. Three scales or levels were employed to group the counties according to location or population structure codes: state, rural-urban continuum code (RUCC) and urban influence code (UIC). An analysis of >130,000 pairs (network links) was performed, corresponding to AIDS prevalence in 2310 U.S. counties vs. drug cocktails made up of combinations of ChEMBL results for 21,582 unique drugs, 9 viral or human protein targets, 4856 protocols, and 10 possible experimental measures. The best model found with the original data was a linear neural network (LNN) with AUROC > 0.80 and accuracy, specificity, and sensitivity ≈ 77% in training and external validation series. The change of spatial and population structure scale (state, UIC, or RUCC codes) did not affect the quality of the model. Imbalance was detected in all the models found when comparing positive/negative cases and linear/non-linear model accuracy ratios. Using synthetic minority over-sampling technique (SMOTE) data pre-processing and machine-learning algorithms implemented in the WEKA software, more balanced models were found. In particular, a multilayer perceptron (MLP) with AUROC = 97.4% and precision, recall, and F-measure > 90% was found.
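
The class-balancing strategy named in the abstract (SMOTE followed by a multilayer perceptron, evaluated by AUROC) can be illustrated generically. The Python sketch below uses synthetic data in place of the ChEMBL/AIDSVu/Census features, so it shows the workflow only, not the published model.

```python
# A minimal sketch of SMOTE oversampling followed by an MLP classifier and
# AUROC evaluation, on an imbalanced synthetic dataset standing in for the
# drug-cocktail / county link data. Not the paper's model or features.
from imblearn.over_sampling import SMOTE
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.metrics import roc_auc_score

X, y = make_classification(n_samples=5000, n_features=20, weights=[0.85, 0.15], random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, stratify=y, random_state=0)

X_bal, y_bal = SMOTE(random_state=0).fit_resample(X_train, y_train)     # oversample the minority class
clf = MLPClassifier(hidden_layer_sizes=(50,), max_iter=500, random_state=0).fit(X_bal, y_bal)

auroc = roc_auc_score(y_test, clf.predict_proba(X_test)[:, 1])
print(f"test AUROC = {auroc:.3f}")
```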

Pub.: 29 Apr '15, Pinned: 26 Jan '17

Association of Facebook Use With Compromised Well-Being: A Longitudinal Study.

Abstract: Face-to-face social interactions enhance well-being. With the ubiquity of social media, important questions have arisen about the impact of online social interactions. In the present study, we assessed the associations of both online and offline social networks with several subjective measures of well-being. We used 3 waves (2013, 2014, and 2015) of data from 5,208 subjects in the nationally representative Gallup Panel Social Network Study survey, including social network measures, in combination with objective measures of Facebook use. We investigated the associations of Facebook activity and real-world social network activity with self-reported physical health, self-reported mental health, self-reported life satisfaction, and body mass index. Our results showed that overall, the use of Facebook was negatively associated with well-being. For example, a 1-standard-deviation increase in "likes clicked" (clicking "like" on someone else's content), "links clicked" (clicking a link to another site or article), or "status updates" (updating one's own Facebook status) was associated with a decrease of 5%-8% of a standard deviation in self-reported mental health. These associations were robust to multivariate cross-sectional analyses, as well as to 2-wave prospective analyses. The negative associations of Facebook use were comparable to or greater in magnitude than the positive impact of offline interactions, which suggests a possible tradeoff between offline and online relationships.
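
The "per 1-standard-deviation" associations quoted above come from standardised regression coefficients. A minimal Python sketch on simulated data, ignoring the study's covariates and panel structure:

```python
# A minimal sketch of a standardised ("per 1-SD") association: z-score a
# Facebook-use measure and a well-being outcome, then read the OLS coefficient.
# Data are simulated placeholders, not the Gallup Panel data.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 2000
likes_clicked = rng.poisson(30, n).astype(float)                    # hypothetical Facebook-use measure
mental_health = 70 - 0.05 * likes_clicked + rng.normal(0, 10, n)    # hypothetical outcome

def zscore(x):
    return (x - x.mean()) / x.std()

df = pd.DataFrame({"likes_z": zscore(likes_clicked), "mh_z": zscore(mental_health)})
model = sm.OLS(df["mh_z"], sm.add_constant(df[["likes_z"]])).fit()
print(model.params)   # coefficient = change in outcome SDs per 1-SD increase in use
```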

Pub.: 18 Jan '17, Pinned: 26 Jan '17

Public preferences for electronic health data storage, access, and sharing - evidence from a pan-European survey.

Abstract: To assess the public's preferences regarding potential privacy threats from devices or services storing health-related personal data. A pan-European survey based on a stated-preference experiment was used to assess preferences for electronic health data storage, access, and sharing. We obtained 20 882 survey responses (94 606 preferences) from 27 EU member countries. Respondents recognized the benefits of storing electronic health information, with 75.5%, 63.9%, and 58.9% agreeing that storage was important for improving treatment quality, preventing epidemics, and reducing delays, respectively. Concerns about different levels of access by third parties were expressed by 48.9% to 60.6% of respondents. On average, compared to devices or systems that only store basic health status information, respondents preferred devices that also store identification data (coefficient/relative preference = 0.04, 95% CI [0.00 to 0.08], P = 0.034) and information on lifelong health conditions (coefficient = 0.13 [0.08 to 0.18], P < 0.001), but there was no evidence of this for devices with information on sensitive health conditions such as mental and sexual health and addictions (coefficient = -0.03 [-0.09 to 0.02], P = 0.24). Respondents were averse to their immediate family (coefficient = -0.05 [-0.05 to -0.01], P = 0.011) and home care nurses (coefficient = -0.06 [-0.11 to -0.02], P = 0.004) viewing this data, and strongly averse to health insurance companies (coefficient = -0.43 [-0.52 to -0.34], P < 0.001), private sector pharmaceutical companies (coefficient = -0.82 [-0.99 to -0.64], P < 0.001), and academic researchers (coefficient = -0.53 [-0.66 to -0.40], P < 0.001) viewing the data. Storing more detailed electronic health data was generally preferred, but respondents were averse to wider access to and sharing of this information. When developing frameworks for the use of electronic health data, policy makers should consider approaches that both highlight the benefits to the individual and minimize the perception of privacy risks.
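
The coefficients quoted above come from a stated-preference (discrete choice) analysis. The Python sketch below is a deliberately simplified stand-in: a plain logit on hypothetical alternative-level data, just to show how negative coefficients encode aversion to an access attribute. The study's actual estimation would typically use a conditional or mixed logit over choice sets.

```python
# A highly simplified sketch of recovering attribute preferences from
# stated-choice data with a plain logit. Attributes, effect sizes and the
# data are hypothetical; this is not the survey's estimation strategy.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(2)
n = 4000
df = pd.DataFrame({
    "stores_lifelong_conditions": rng.integers(0, 2, n),   # hypothetical attribute
    "insurer_can_view": rng.integers(0, 2, n),              # hypothetical attribute
})
# Hypothetical utilities: mild preference for richer storage, strong aversion to insurer access.
utility = 0.15 * df["stores_lifelong_conditions"] - 0.45 * df["insurer_can_view"]
df["chosen"] = (utility + rng.logistic(0, 1, n) > 0).astype(int)

X = sm.add_constant(df[["stores_lifelong_conditions", "insurer_can_view"]])
fit = sm.Logit(df["chosen"], X).fit(disp=0)
print(fit.params)   # signs mirror the preference/aversion pattern described above
```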

Pub.: 24 Apr '16, Pinned: 26 Jan '17

Change in fast walking speed preceding death: results from a prospective longitudinal cohort study.

Abstract: Walking speed (WS) predicts mortality. However, it is unclear whether the rate of decline in WS increases prior to death. We examined whether (a) WS declined faster in persons who died during the follow-up compared with those who remained alive and (b) adding change in WS to a model including age, sex, and baseline WS improved prediction of mortality. Data are from 4,016 participants of the Dijon center of the Three-City study (France), aged 65-85 years. Fast WS (FWS) was measured up to five times over a 12-year period. Mortality was ascertained until 2012. Linear mixed models using a backward time scale showed that FWS declined faster in the 908 participants who died during the follow-up (annual change = -0.031 m/s) than in those who survived (-0.021 m/s), corresponding to a difference of -0.009 (95% confidence interval = -0.013 to -0.005) m/s. Compared with "normal" change in FWS (annual change ≥ -0.04 m/s), "substantial" decline (< -0.08 m/s) was associated with a 1.4-fold greater risk of mortality (hazard ratio = 1.40, confidence interval = 1.02-1.92) and small decline (-0.08 to -0.04 m/s) with a 1.2-fold greater risk (hazard ratio = 1.18, confidence interval = 0.89-1.57). The net reclassification index when adding these categories of change in FWS to the model adjusted for age, sex, and baseline FWS was 19.0% (0.6 to 36.8%). Participants who died during the follow-up had a steeper decline in FWS than the others. Both baseline FWS and FWS decline predict mortality.
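
The modelling idea, FWS regressed on a backward time scale with a steeper slope among decedents, can be sketched with a linear mixed model. The Python example below uses data simulated to roughly match the quoted slopes; covariates and the exact Three-City specification are not reproduced.

```python
# A minimal sketch of a linear mixed model of fast walking speed (FWS) on a
# backward time scale, with a steeper slope for participants who died.
# Data are simulated placeholders; slopes are seeded from the abstract.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(3)
ids = np.repeat(np.arange(400), 5)
years_before_end = np.tile(np.array([-12.0, -9.0, -6.0, -3.0, 0.0]), 400)   # backward time scale
died = np.repeat(rng.integers(0, 2, 400), 5)

slope = np.where(died == 1, -0.031, -0.021)              # annual change (m/s), per the abstract
fws = 1.8 + slope * -years_before_end * -1 + slope * 0   # placeholder kept simple below
fws = 1.8 + slope * years_before_end * -1 * -1           # equivalent to 1.8 + slope * t
fws = 1.8 + slope * years_before_end + rng.normal(0, 0.08, ids.size)
df = pd.DataFrame({"id": ids, "t": years_before_end, "died": died, "fws": fws})

model = smf.mixedlm("fws ~ t * died", df, groups=df["id"], re_formula="~t").fit()
print(model.fe_params)                                   # "t:died" ~ extra annual decline among decedents
```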

Pub.: 06 Aug '13, Pinned: 26 Jan '17

Effectiveness of activity trackers with and without incentives to increase physical activity (TRIPPA): a randomised controlled trial.

Abstract: Despite the increasing popularity of activity trackers, little evidence exists that they can improve health outcomes. We aimed to investigate whether use of activity trackers, alone or in combination with cash incentives or charitable donations, leads to increases in physical activity and improvements in health outcomes. In this randomised controlled trial, employees from 13 organisations in Singapore were randomly assigned (1:1:1:1) with a computer-generated assignment schedule to control (no tracker or incentives), Fitbit Zip activity tracker, tracker plus charity incentives, or tracker plus cash incentives. Participants had to be English-speaking, full-time employees, aged 21-65 years, able to walk at least ten steps continuously, and non-pregnant. Incentives were tied to weekly steps, and the primary outcome, moderate-to-vigorous physical activity (MVPA) bout min per week, was measured via a sealed accelerometer and assessed on an intention-to-treat basis at 6 months (end of intervention) and 12 months (after a 6-month post-intervention follow-up period). Other outcome measures included steps, participants meeting the 70 000 steps per week target, and health-related outcomes including weight, blood pressure, and quality-of-life measures. This trial is registered at ClinicalTrials.gov, number NCT01855776. Between June 13, 2013, and Aug 15, 2014, 800 participants were recruited and randomly assigned to the control (n=201), Fitbit (n=203), charity (n=199), and cash (n=197) groups. At 6 months, compared with control, the cash group logged an additional 29 MVPA bout min per week (95% CI 10-47; p=0·0024) and the charity group an additional 21 MVPA bout min per week (2-39; p=0·0310); the difference between Fitbit only and control was not significant (16 MVPA bout min per week [-2 to 35; p=0·0854]). Increases in MVPA bout min per week in the cash and charity groups were not significantly greater than that of the Fitbit group. At 12 months, the Fitbit group logged an additional 37 MVPA bout min per week (19-56; p=0·0001) and the charity group an additional 32 MVPA bout min per week (12-51; p=0·0013) compared with control; the difference between cash and control was not significant (15 MVPA bout min per week [-5 to 34; p=0·1363]). A decrease in physical activity of -23 MVPA bout min per week (95% CI -42 to -4; p=0·0184) was seen when comparing the cash group with the Fitbit group. There were no improvements in any health outcomes (weight, blood pressure, etc) at either assessment. The cash incentive was most effective at increasing MVPA bout min per week at 6 months, but this effect was not sustained 6 months after the incentives were discontinued. At 12 months, the activity tracker with or without charity incentives was effective at stemming the reduction in MVPA bout min per week seen in the control group, but we identified no evidence of improvements in health outcomes, either with or without incentives, calling into question the value of these devices for health promotion. Although other incentive strategies might generate greater increases in step activity and improvements in health outcomes, incentives would probably need to be in place long term to avoid any potential decrease in physical activity resulting from discontinuation. Funding: Ministry of Health, Singapore.
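
An intention-to-treat comparison of MVPA bout minutes across the four arms reduces to group contrasts against control with confidence intervals. A minimal Python sketch on simulated outcomes, seeded with the abstract's point estimates purely for illustration:

```python
# A minimal sketch of an intention-to-treat comparison of MVPA bout minutes per
# week across four arms via OLS with the control arm as reference. Outcome data
# are simulated; this is not the trial's analysis code.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(4)
groups = np.repeat(["control", "fitbit", "charity", "cash"], 200)
effect = {"control": 0, "fitbit": 16, "charity": 21, "cash": 29}     # point estimates from the abstract
mvpa = np.array([effect[g] for g in groups]) + rng.normal(120, 80, groups.size).clip(0)

df = pd.DataFrame({"group": groups, "mvpa": mvpa})
fit = smf.ols("mvpa ~ C(group, Treatment(reference='control'))", df).fit()
print(fit.params)        # estimated differences vs. control
print(fit.conf_int())    # 95% confidence intervals
```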

Pub.: 09 Oct '16, Pinned: 28 Jan '17

Machine learning applications in cancer prognosis and prediction.

Abstract: Cancer has been characterized as a heterogeneous disease consisting of many different subtypes. The early diagnosis and prognosis of a cancer type have become a necessity in cancer research, as they can facilitate the subsequent clinical management of patients. The importance of classifying cancer patients into high- or low-risk groups has led many research teams, from the biomedical and bioinformatics fields, to study the application of machine learning (ML) methods. These techniques have therefore been applied to model the progression and treatment of cancerous conditions. In addition, the ability of ML tools to detect key features in complex datasets reveals their importance. A variety of these techniques, including Artificial Neural Networks (ANNs), Bayesian Networks (BNs), Support Vector Machines (SVMs) and Decision Trees (DTs), have been widely applied in cancer research for the development of predictive models, resulting in effective and accurate decision making. Even though it is evident that the use of ML methods can improve our understanding of cancer progression, an appropriate level of validation is needed for these methods to be considered in everyday clinical practice. In this work, we present a review of recent ML approaches employed in the modeling of cancer progression. The predictive models discussed here are based on various supervised ML techniques as well as on different input features and data samples. Given the growing trend in the application of ML methods in cancer research, we present the most recent publications that employ these techniques to model cancer risk or patient outcomes.
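
As a generic illustration of the model families reviewed, the Python sketch below cross-validates an SVM and a decision tree on synthetic data standing in for a cancer outcome-prediction dataset; it reproduces none of the reviewed studies.

```python
# A minimal sketch comparing two of the ML model families named above (SVM and
# decision tree) with cross-validated AUROC on synthetic data.
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC
from sklearn.tree import DecisionTreeClassifier

# Synthetic stand-in for a cancer outcome dataset (features could be clinical or genomic).
X, y = make_classification(n_samples=1000, n_features=30, n_informative=8, random_state=0)

models = {
    "SVM": make_pipeline(StandardScaler(), SVC(probability=True, random_state=0)),
    "Decision tree": DecisionTreeClassifier(max_depth=4, random_state=0),
}
for name, model in models.items():
    scores = cross_val_score(model, X, y, cv=5, scoring="roc_auc")
    print(f"{name}: mean AUROC = {scores.mean():.3f} (+/- {scores.std():.3f})")
```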

Pub.: 10 Mar '15, Pinned: 26 Jan '17

Heart Failure: Diagnosis, Severity Estimation and Prediction of Adverse Events Through Machine Learning Techniques

Abstract: Heart failure is a serious condition with high prevalence (about 2% in the adult population in developed countries, and more than 8% in patients older than 75 years). About 3–5% of hospital admissions are linked with heart failure incidents. Heart failure is the leading cause of admission encountered by healthcare professionals in their clinical practice. The costs are very high, reaching up to 2% of total health costs in developed countries. Building an effective disease management strategy requires analysis of large amounts of data, early detection of the disease, assessment of its severity and early prediction of adverse events. This will inhibit the progression of the disease, improve the quality of life of the patients and reduce the associated medical costs. To this end, machine learning techniques have been employed. The aim of this paper is to present the state of the art of the machine learning methodologies applied for the assessment of heart failure. More specifically, models predicting the presence of heart failure, estimating its subtype, assessing its severity and predicting the presence of adverse events, such as destabilizations, re-hospitalizations, and mortality, are presented. To the authors' knowledge, this is the first time that such a comprehensive review, focusing on all aspects of the management of heart failure, has been presented.

Pub.: 17 Nov '16, Pinned: 26 Jan '17

Lung Cancer Risk Prediction Model Incorporating Lung Function: Development and Validation in the UK Biobank Prospective Cohort Study.

Abstract: Purpose: Several lung cancer risk prediction models have been developed, but none to date have assessed the predictive ability of lung function in a population-based cohort. We sought to develop and internally validate a model incorporating lung function using data from the UK Biobank prospective cohort study. Methods: This analysis included 502,321 participants without a previous diagnosis of lung cancer, predominantly between 40 and 70 years of age. We used flexible parametric survival models to estimate the 2-year probability of lung cancer, accounting for the competing risk of death. Models included predictors previously shown to be associated with lung cancer risk, including sex, variables related to smoking history and nicotine addiction, medical history, family history of lung cancer, and lung function (forced expiratory volume in 1 second [FEV1]). Results: During accumulated follow-up of 1,469,518 person-years, there were 738 lung cancer diagnoses. A model incorporating all predictors had excellent discrimination (concordance (c)-statistic [95% CI] = 0.85 [0.82 to 0.87]). Internal validation suggested that the model will discriminate well when applied to new data (optimism-corrected c-statistic = 0.84). The full model, including FEV1, also had modestly better discrimination than a model based solely on questionnaire variables (c-statistic = 0.84 [0.82 to 0.86]; optimism-corrected c-statistic = 0.83; P for FEV1 = 3.4 × 10⁻¹³). The full model had better discrimination than standard lung cancer screening eligibility criteria (c-statistic = 0.66 [0.64 to 0.69]). Conclusion: A risk prediction model that includes lung function has strong predictive ability, which could improve eligibility criteria for lung cancer screening programs.
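
The discrimination metric reported above is the concordance (c-) statistic of a survival model. The Python sketch below fits a Cox proportional-hazards model, not the paper's flexible parametric, competing-risk model, on simulated stand-ins for smoking history and FEV1, and reads off Harrell's c-statistic.

```python
# A minimal sketch of fitting a proportional-hazards model and reading its
# concordance (c-) statistic. Predictors and follow-up data are simulated
# placeholders, not UK Biobank data, and no competing-risk adjustment is made.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(5)
n = 5000
smoking_pack_years = rng.gamma(2.0, 10.0, n)          # hypothetical smoking exposure
fev1 = rng.normal(3.0, 0.6, n)                        # hypothetical FEV1 in litres

hazard = np.exp(0.03 * smoking_pack_years - 0.8 * fev1)
time = rng.exponential(50 / hazard)                   # hypothetical time to diagnosis (years)
event = (time < 2).astype(int)                        # toy outcome: diagnosis within 2 years
df = pd.DataFrame({"T": np.minimum(time, 2.0), "E": event,
                   "pack_years": smoking_pack_years, "fev1": fev1})

cph = CoxPHFitter().fit(df, duration_col="T", event_col="E")
print(f"c-statistic = {cph.concordance_index_:.3f}")
```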

Pub.: 18 Jan '17, Pinned: 28 Jan '17