BACKGROUND: Observational studies suggest a link between n-3 polyunsaturated fatty acid (PUFA) intake, n-3 PUFA status, and depression in adults, but studies in adolescents are scarce. This study aimed to determine associations of n-3 PUFA status and intake with paediatric major depressive disorder (pMDD) in Swiss adolescents. METHODS: We conducted a matched case-control study in 95 adolescents diagnosed with pMDD and 95 healthy controls aged 13 to <18 years. We analysed red blood cell (RBC) fatty acid (FA) composition (% of total FA). n-3 PUFA intake was assessed using a focused food frequency questionnaire, and depression severity was assessed by the Children's Depression Rating Scale-Revised (CDRS-R). RESULTS: Mean RBC eicosapentaenoic acid (EPA) and docosahexaenoic acid (DHA) were lower in cases than controls (EPA: 0.41 ± 0.11 vs 0.46 ± 0.12, p
OBJECTIVE: Registers of diagnoses and treatments exist in different forms in the European countries and are potential sources to answer important research questions. Prevalence and incidence of thyroid diseases are highly dependent on iodine intake and, thus, on iodine deficiency disease prevention programs. We aimed to collect European register data on thyroid outcomes to compare the rates between countries/regions with different iodine status and prevention programs. DESIGN: Register-based cross-sectional study. METHODS: National register data on thyroid diagnoses and treatments were requested from 23 European countries/regions. The provided data were critically assessed for suitability for comparison between countries/regions. Sex- and age-standardized rates were calculated. RESULTS: Register data on ≥1 thyroid diagnosis or treatment were available from 22 countries/regions. After critical assessment, data on medication, surgery, and cancer were found suitable for comparison between 9, 10, and 13 countries/regions, respectively. Higher rates of antithyroid medication and thyroid surgery for benign disease and lower rates of thyroid hormone therapy were found for countries with iodine insufficiency before approx. 2001, and no relationship was observed with recent iodine intake or prevention programs. CONCLUSIONS: The collation of register data on thyroid outcomes from European countries is impeded by a high degree of heterogeneity in the availability and quality of data between countries. Nevertheless, a relationship between historic iodine intake and rates of treatments for hyper- and hypothyroid disorders is indicated. This study illustrates both the challenges and the potential for the application of register data on thyroid outcomes across Europe.
Iron is arguably the most important nutrient in the ongoing battle between hosts and bacteria. Recently in Nature, a unique iron storage organelle, the ferrosome, was discovered in the human pathogen Clostridioides difficile [1]. But what is the role of ferrosomes, and how do they affect bacterial behavior and infection?
BACKGROUND: The combination of cultivation studies with molecular analysis approaches allows characterization of the complex human gut microbiota in depth. In vitro cultivation studies of infants living in rural sub-Saharan Africa are scarce. In this study, a batch cultivation protocol for Kenyan infant fecal microbiota was validated. METHODS: Fresh fecal samples were collected from 10 infants living in a rural area of Kenya. Samples were transported under protective conditions and subsequently prepared for inoculation within less than 30 h for batch cultivation. A diet-adapted cultivation medium was used that mimicked the daily intake of human milk and maize porridge in Kenyan infants during weaning. 16S rRNA gene amplicon sequencing and HPLC analyses were performed to assess the composition and metabolic activity, respectively, of the fecal microbiota after 24 h of batch cultivation. RESULTS: High abundance of Bifidobacterium (53.4 ± 11.1%) and high proportions of acetate (56 ± 11% of total metabolites) and lactate (24 ± 22% of total metabolites) were detected in the Kenyan infant fecal microbiota. After cultivation started at an initial pH of 7.6, the fraction of top bacterial genera (≥1% abundance) shared between fermentation and fecal samples was high, at 97 ± 5%. However, Escherichia-Shigella, Clostridium sensu stricto 1, Bacteroides and Enterococcus were enriched concomitant with decreased Bifidobacterium abundance. Decreasing the initial pH to 6.9 led to a higher abundance of Bifidobacterium after incubation and increased the compositional similarity of fermentation and fecal samples. Despite similar total metabolite production of all fecal microbiota after cultivation, inter-individual differences in metabolite profiles were apparent.
CONCLUSIONS: Protected transport and batch cultivation under host- and diet-adapted conditions allowed regrowth of the top abundant genera and reproduced the metabolic activity of fresh Kenyan infant fecal microbiota. The validated batch cultivation protocol can be used to study the composition and functional potential of Kenyan infant fecal microbiota in vitro.
BACKGROUND: Guidelines to treat iron deficiency recommend daily provision of oral iron, but this may decrease fractional iron absorption and increase side effects. Our objective was to compare consecutive-day versus alternate-day iron supplementation. METHODS: In a double-masked, randomized, placebo-controlled trial, young Swiss women (n = 150; serum ferritin ≤30 μg/L) were assigned to: daily 100 mg iron for 90 d, followed by daily placebo for another 90 d (consecutive-day group), or the same daily dose of iron and placebo on alternate days for 180 d (alternate-day group). The study period was 24/11/2021-10/8/2022. Co-primary outcomes, at equal total iron doses, were serum ferritin and gastrointestinal side effects; secondary outcomes were iron deficiency and serum hepcidin. Compliance and side effects were recorded daily using a mobile application. Data were analysed using mixed models and longitudinal prevalence ratios (LPR). The trial was registered at ClinicalTrials.gov (NCT05105438). FINDINGS: 75 women were assigned to each group and included in the intention-to-treat analysis. Capsule adherence and side-effect reporting were >97% in both groups. At equal total iron doses, comparing the consecutive-day and alternate-day groups, median serum ferritin was 43.8 μg/L (31.7-58.2) versus 44.8 μg/L (33.8-53.6) (P = 0.98), and the LPR for gastrointestinal side effects on days of iron intake was 1.56 (95% CI: 1.38, 1.77; P
BACKGROUND: Zinc-biofortified potatoes have considerable potential to reduce zinc deficiency because of their low levels of phytate, an inhibitor of zinc absorption, and their high consumption, especially in the Andean region of Peru. OBJECTIVES: The purpose of this study was to measure fractional and total zinc absorption from a test meal of biofortified compared with regular potatoes. METHODS: We undertook a single-blinded randomized crossover study (using 67Zn and 70Zn stable isotopes) in which 37 women consumed 500 g of biofortified or regular potatoes twice a day. Urine samples were collected to determine fractional and total zinc absorption. RESULTS: The zinc content of the biofortified potato and regular potato was 0.48 (standard deviation [SD]: 0.02) and 0.32 (SD: 0.03) mg/100 g fresh weight, respectively. Mean fractional zinc absorption (FZA) from the biofortified potatoes was lower than from the regular potatoes: 20.8% (SD: 5.4%) and 25.5% (SD: 7.0%), respectively (P < 0.01). However, total zinc absorbed was significantly higher from the biofortified potatoes: 0.49 (SD: 0.13) versus 0.40 (SD: 0.11) mg/500 g (P < 0.01). CONCLUSIONS: The results of this study demonstrate that biofortified potatoes provide more absorbable zinc than regular potatoes. Zinc-biofortified potatoes could contribute toward reducing zinc deficiency in populations where potatoes are a staple food. This trial was registered at clinicaltrials.gov as NCT05154500.
BACKGROUND: Agronomic zinc biofortification of wheat by foliar application increases wheat zinc content and total zinc absorption in humans. OBJECTIVES: To assess the effect of agronomically biofortified whole wheat flour (BFW) on plasma zinc concentration (PZC) compared with a postharvest fortified wheat (PHFW) and unfortified control wheat (CW) when integrated in a midday school meal scheme. METHODS: We conducted a 20-wk double-blind intervention trial in children (4-12 y, n = 273) individually randomly assigned to 3 groups to receive a daily school lunch consisting of 3 chapattis prepared with the 3 different wheat flour types. Measurements of anthropometry, blood biochemistry, and leukocyte DNA strand breaks were conducted. We applied sparse serial sampling to monitor PZC over time, and analysis was performed using linear mixed-effects models. RESULTS: Mean zinc contents in BFW, PHFW, and CW were 48.0, 45.1, and 21.2 ppm, respectively (P < 0.001). Mean (standard deviation) daily zinc intakes from the study intervention in the BFW, PHFW, and CW groups were 4.4 (1.6), 5.9 (1.9) and 2.6 (0.6) mg Zn/d, respectively, with intakes in the PHFW and BFW groups differing from CW (P < 0.001) but no difference between BFW and PHFW. There was no time effect, group difference, or group × time interaction in PZC. Prevalence of zinc deficiency decreased in the BFW (from 14.1% to 11.2%), PHFW (from 8.9% to 2.3%), and CW (from 9.8% to 8.8%) groups, but there was no time × treatment interaction in the prevalence of zinc deficiency (P = 0.191). Compliance with consuming the study school meals was associated with PZC (P = 0.006). DNA strand breaks were not significantly associated with PZC (n = 51; r = 0.004, P = 0.945). CONCLUSIONS: Consumption of either PHFW or BFW provided an additional ∼1.8 to 3.3 mg Zn/d, but it did not affect PZC, zinc deficiency, growth, or DNA strand breaks. This trial was registered on clinicaltrials.gov as NCT02241330 and ctri.nic.in as CTRI/2015/06/005913.
The objective of this paper is to review the global effort to eliminate iodine deficiency and its impact on public health. Iodine is an essential component of hormones produced by the thyroid gland. Iodine deficiency has multiple adverse effects in humans due to inadequate thyroid hormone production that are termed the iodine deficiency disorders. The major adverse effect is impaired cognition in children. The WHO's first estimate of the global prevalence of goitre, in 1960, suggested that 20-60% of the world's population was affected, with most of the burden in low- and middle-income countries. Iodine deficiency was identified as a key global risk factor for impaired child development where the need for intervention was urgent. This spurred a worldwide effort to eliminate iodine deficiency, led by a coalition of international organisations working closely with national governments and the salt industry. In most countries, the best strategy to control iodine deficiency is carefully monitored iodisation of salt. The reach of current iodised salt programmes is remarkable: in 2018, 88% of the global population used iodised salt. The number of countries with adequate iodine intake has nearly doubled over the past 20 years, from 67 in 2003 to 118 in 2020. The resulting improvement in cognitive development and future earnings suggests a potential global economic benefit of nearly $33 billion. Iodine programmes are appealing for national governments because the health and economic consequences of iodine deficiency are high and can be easily averted by salt iodisation, a low-cost and sustainable intervention.
BACKGROUND: Iron programs in low- and middle-income countries often target infants and young children. Limited data from human infants and mouse models suggest that homeostatic control of iron absorption is incomplete in early infancy. Excess iron absorption during infancy may have detrimental effects. OBJECTIVES: Our aims were to 1) investigate determinants of iron absorption in infants aged 3-15 mo and assess whether regulation of iron absorption is fully mature during this period and 2) define the threshold ferritin and hepcidin concentrations in infancy that trigger upregulation of iron absorption. METHODS: We performed a pooled analysis of standardized, stable iron isotope absorption studies performed by our laboratory in infants and toddlers. We used generalized additive mixed modeling (GAMM) to examine relationships between ferritin, hepcidin, and fractional iron absorption (FIA). RESULTS: Kenyan and Thai infants aged 2.9-15.1 mo (n = 269) were included; 66.8% were iron deficient and 50.4% were anemic. In regression models, hepcidin, ferritin, and serum transferrin receptor were significant predictors of FIA, whereas C-reactive protein was not. In the model including hepcidin, hepcidin was the strongest predictor of FIA (β = -0.435). In all models, interaction terms, including age, were not significant predictors of FIA or hepcidin. The fitted GAMM trend of ferritin versus FIA showed a significant negative slope until a ferritin of 46.3 μg/L (95% CI: 42.1, 50.5 μg/L), which corresponded to an FIA decrease from 26.5% to 8.3%; above this ferritin value, FIA remained stable. The fitted GAMM trend of hepcidin versus FIA showed a significant negative slope until a hepcidin of 3.15 nmol/L (95% CI: 2.67, 3.63 nmol/L), above which FIA remained stable. CONCLUSIONS: Our findings suggest that the regulatory pathways of iron absorption are intact in infancy. In infants, iron absorption begins to increase below threshold ferritin and hepcidin values of ∼46 μg/L and ∼3 nmol/L, respectively, similar to adult values.
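The threshold estimates above come from a GAMM fitted to the pooled absorption data. As a much simpler illustration of the same idea (absorption falling with ferritin up to a breakpoint, then flat), a broken-stick model y = a + b·min(x, c) can be fitted by grid-searching the breakpoint c with an ordinary least-squares refit at each candidate. This is a sketch of a stand-in technique, not the authors' method; the function name and synthetic values are illustrative:

```python
def fit_plateau(xs, ys, grid):
    """Fit y = a + b * min(x, c): a linear trend up to breakpoint c,
    then a flat plateau. Grid-search c; OLS for a and b at each candidate."""
    best = None
    for c in grid:
        zs = [min(x, c) for x in xs]
        n = len(zs)
        mz = sum(zs) / n
        my = sum(ys) / n
        szz = sum((z - mz) ** 2 for z in zs)
        if szz == 0:  # breakpoint below all data: slope undefined
            continue
        b = sum((z - mz) * (y - my) for z, y in zip(zs, ys)) / szz
        a = my - b * mz
        sse = sum((y - (a + b * z)) ** 2 for z, y in zip(zs, ys))
        if best is None or sse < best[0]:
            best = (sse, c, a, b)
    return best[1], best[2], best[3]  # breakpoint, intercept, slope

# Illustrative data: FIA-like values declining until x = 46, then flat
xs = [10, 20, 30, 40, 45, 47, 50, 60, 70, 80]
ys = [30 - 0.5 * min(x, 46) for x in xs]
breakpoint_c, intercept, slope = fit_plateau(xs, ys, [40, 42, 44, 45, 46, 47, 48, 50])
```

A GAMM instead fits a penalized smooth and reads the plateau off the fitted trend, which is how a confidence interval for the breakpoint can be reported.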
Haemoglobin Bart's hydrops fetalis syndrome (BHFS) represents the most severe form of α-thalassaemia, arising from deletion of the duplicated α-globin genes from both alleles. The absence of α-globin leads to the formation of non-functional haemoglobin Bart's (γ4) or haemoglobin H (HbH: β4), resulting in severe anaemia, tissue hypoxia, and, in some cases, variable congenital or neurocognitive abnormalities. BHFS is the most common cause of hydrops fetalis in Southeast Asia; however, owing to global migration, the burden of this condition is increasing worldwide. With the availability of intensive perinatal care and intrauterine transfusions, an increasing number of patients survive with this condition. The current approach to long-term management of survivors involves regular blood transfusions and iron chelation, a task made challenging by the need for intensified transfusions to suppress the production of non-functional HbH-containing erythrocytes. While our knowledge of outcomes of this condition is evolving, it seems that, in comparison to individuals with transfusion-dependent β-thalassaemia, those with BHFS may face an elevated risk of complications arising from chronic anaemia and hypoxia, ongoing haemolysis, iron overload, and their respective treatments. Although stem cell transplantation remains a viable option for a select few, it is not without potential side effects. Looking ahead, potential advancements in the form of genetic engineering and innovative therapeutic approaches, such as the reactivation of embryonic α-like globin gene expression, hold promise for furthering the treatment of this condition. Prevention remains a crucial aspect of care, particularly in areas with high prevalence or limited resources.
Psoriasis is a chronic, inflammatory skin disorder characterized by well-demarcated erythematous lesions with surface scaling. The disease is underpinned by a dysregulated immune response with a shift in the balance of neutrophils, lymphocytes and platelets. We sought to evaluate the novel systemic inflammatory markers, neutrophil-to-lymphocyte ratio (NLR) and platelet-to-lymphocyte ratio (PLR), as psoriatic indicators. PubMed, Web of Science and Scopus were systematically searched for relevant studies. Twenty-four studies comprising a total of 2,275 psoriatic patients (1,301 males and 974 females) and 2,334 healthy controls (1,401 males and 933 females) were identified for inclusion in the quantitative analysis. The NLR and PLR were found to be significantly increased in psoriatic patients [standardized mean difference (SMD) = 0.68, 95% CI 0.56-0.80, p
AIMS: Accurate staging of hypertension-related cardiac changes, before the development of significant left ventricular hypertrophy, could help guide early prevention advice. We evaluated whether a novel semi-supervised machine learning approach could generate a clinically meaningful summary score of cardiac remodelling in hypertension. METHODS AND RESULTS: A contrastive trajectories inference approach was applied to data collected from three UK studies of young adults. Low-dimensional variance was identified in 66 echocardiography variables from participants with hypertension (systolic ≥160 mmHg) relative to a normotensive group (systolic <120 mmHg) using contrastive principal component analysis. A minimum spanning tree was constructed to derive a normalized score for each individual reflecting the extent of cardiac remodelling between zero (health) and one (disease). Model stability and clinical interpretability were evaluated, as well as modifiability in response to a 16-week exercise intervention. A total of 411 young adults (29 ± 6 years) were included in the analysis, and, after contrastive dimensionality reduction, 21 variables characterized >80% of data variance. Repeated scores for an individual in cross-validation were stable (root mean squared deviation = 0.1 ± 0.002), with good differentiation of normotensive and hypertensive individuals (area under the receiver operating characteristic curve = 0.98). The derived score followed expected hypertension-related patterns in individual cardiac parameters at baseline and decreased after exercise, proportional to intervention compliance (P = 0.04) and improvement in ventilatory threshold (P = 0.01). CONCLUSION: A quantitative score that summarizes hypertension-related cardiac remodelling in young adults can be generated from a computational model. This score might allow more personalized early prevention advice, but further evaluation of clinical applicability is required.
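The scoring pipeline described in this abstract (contrastive dimensionality reduction against a background group, then a normalized 0-1 score) can be illustrated in miniature. The toy sketch below implements contrastive PCA for just two features, where the leading eigenvector of C_target − α·C_background has a closed form, then min-max scales projections onto that direction to [0, 1]. It omits the minimum spanning tree step, and all names, data, and the choice α = 1 are illustrative assumptions, not the study's implementation:

```python
import math

def cov2(points):
    # 2x2 covariance of a list of (a, b) pairs, returned as (caa, cab, cbb)
    n = len(points)
    ma = sum(a for a, _ in points) / n
    mb = sum(b for _, b in points) / n
    caa = sum((a - ma) ** 2 for a, _ in points) / n
    cbb = sum((b - mb) ** 2 for _, b in points) / n
    cab = sum((a - ma) * (b - mb) for a, b in points) / n
    return caa, cab, cbb

def leading_eigvec(caa, cab, cbb):
    # unit leading eigenvector of the symmetric matrix [[caa, cab], [cab, cbb]]
    tr, det = caa + cbb, caa * cbb - cab * cab
    lam = tr / 2 + math.sqrt(max(tr * tr / 4 - det, 0.0))
    if abs(cab) > 1e-12:
        v = (lam - cbb, cab)
    else:
        v = (1.0, 0.0) if caa >= cbb else (0.0, 1.0)
    norm = math.hypot(*v)
    return v[0] / norm, v[1] / norm

def contrastive_direction(target, background, alpha=1.0):
    # direction maximizing target variance minus alpha * background variance
    taa, tab, tbb = cov2(target)
    baa, bab, bbb = cov2(background)
    return leading_eigvec(taa - alpha * baa, tab - alpha * bab, tbb - alpha * bbb)

def remodelling_scores(samples, direction):
    # project onto the contrastive direction, then min-max scale to [0, 1]
    proj = [a * direction[0] + b * direction[1] for a, b in samples]
    lo, hi = min(proj), max(proj)
    span = hi - lo if hi > lo else 1.0
    return [(p - lo) / span for p in proj]
```

With variance shared by both groups along the first feature and group-specific variance along the second, the contrastive direction picks out the second axis, whereas ordinary PCA on the target alone would not separate the two sources of variance.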
Hypertensive disorders of pregnancy (HDP) are associated with an increased risk of cardiovascular disorders, with recent evidence linking pre-eclampsia with vascular dementia. We examined associations of HDP with cognitive performance measured in midlife, in a prospective cohort study, the Avon Longitudinal Study of Parents and Children. Six cognitive function domains were measured 20 years after pregnancy at a mean age of 51 years. The cognition tests were repeated at clinics in the following two years. Cognitive function domains measured were immediate and delayed verbal episodic memory, working memory, processing speed, verbal intelligence, and verbal fluency. Exposures were pre-eclampsia, gestational hypertension (GH), and a combined category of any HDP, all compared to normotensive pregnancy. Of 3393 pregnancies included in the analysis, GH was experienced by 417 (12.3%) and pre-eclampsia by 57 (1.7%). GH was associated with lower verbal episodic memory in the delayed logic memory test (-0.16 SDs; 95% CI -0.30, -0.03; p = .015), and there was weak evidence of an association with the immediate logic memory test (-0.13 SDs; -0.27, 0.001; p = .058). However, we did not see steeper declines by age for women with GH, and there was no evidence of associations with other cognitive domains, or for pre-eclampsia with any domains. Results were not substantially changed after controlling for midlife blood pressure. Our findings suggest that a history of GH is associated with slightly reduced episodic memory 20 years after pregnancy, but we found no evidence of a quicker age-related decline compared to women with normotensive pregnancies.
Hypertension is a serious medical condition that affects over a billion people worldwide. Proper management of disease progression requires extended knowledge of the overall functional and structural changes across the whole body in response to hypertension. Here, we propose HyperScore, an integrative and unified measure of hypertension progression based on multi-organ, multi-modality clinical measurements and a semi-supervised machine learning (ML) approach. We developed the measure using a large cohort from the UK Biobank database (n = 27,099) with over 500 imaging and clinical variables from multiple modalities. The semi-supervised approach was built on the contrastive trajectory inference mechanism to provide a score that reflects the proximity of a participant to the disease state (range: 0-1). Modelling revealed that the majority of hypertensive participants had scores above 0.25, whereas normotensives had scores below this threshold. Sensitivity and specificity were above 89%, with an area under the receiver operating characteristic curve of 96.4%. The modelling showed stable performance when evaluating hidden test sets in a 10-fold cross-validation scheme, with an error of nearly 0.1. There was a strong association (r2 > 0.6) between HyperScore and organ phenotypic patterns, especially for variables such as white matter hyperintensity and body mass index. This study is the first to potentiate ML-based modelling of hypertension progression from a multi-organ perspective, which could significantly aid clinical decision-making and help save lives.