{ "items": [ "\n\n
AIMS: Hypertensive pregnancy is associated with increased risks of developing a range of vascular disorders in later life. Understanding when hypertensive target organ damage first emerges could guide the optimal timing of preventive interventions. This review identifies evidence of hypertensive target organ damage across cardiac, vascular, cerebral, and renal systems at different time points from pregnancy to postpartum. METHODS AND RESULTS: Systematic review of Ovid/MEDLINE, EMBASE, and ClinicalTrials.gov up to and including February 2023, with review of reference lists. Identified articles were evaluated via synthesis without meta-analysis, using a vote-counting approach based on direction of effect, regardless of statistical significance. Risk of bias was assessed for each outcome domain, and only higher-quality studies were used for the final analysis. From 7644 articles, 76 high-quality studies, including data from 1 742 698 pregnancies, were identified that reported either blood pressure trajectories or target organ damage during or after a hypertensive pregnancy. Left ventricular hypertrophy, white matter lesions, proteinuria, and retinal microvasculature changes were first evident in women during a hypertensive pregnancy. Cardiac, cerebral, and retinal changes were also reported in studies performed during the early and late postpartum periods, despite the reduction in blood pressure early postpartum. Cognitive dysfunction was first reported late postpartum. CONCLUSION: The majority of target organ damage reported during a hypertensive pregnancy remains evident throughout the early and late postpartum period despite variation in blood pressure. Early peripartum strategies may be required to prevent or reverse target organ damage in women who have had a hypertensive pregnancy.
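The vote-counting synthesis described above tallies, for each outcome domain, how many studies report an effect in each direction, irrespective of statistical significance. A minimal illustrative sketch of this approach in Python follows; the study entries are hypothetical placeholders, not the review's actual dataset.

# Illustrative vote-counting synthesis by direction of effect.
# Each study contributes a direction (+1 = target organ damage present/worse,
# -1 = absent/better) regardless of its p-value. Data below are hypothetical.
from collections import defaultdict

studies = [
    ("left ventricular hypertrophy", +1),
    ("left ventricular hypertrophy", +1),
    ("left ventricular hypertrophy", -1),
    ("white matter lesions", +1),
    ("white matter lesions", +1),
]

votes = defaultdict(lambda: [0, 0])  # domain -> [n_positive, n_negative]
for domain, direction in studies:
    votes[domain][0 if direction > 0 else 1] += 1

for domain, (pos, neg) in votes.items():
    print(f"{domain}: {pos}/{pos + neg} studies in the direction of effect")

A binomial (sign) test on such counts is the usual formal complement to vote counting when a summary p-value per domain is wanted.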
Extracellular vesicles (EVs), through their complex cargo, can reflect the state of their cell of origin and change the functions and phenotypes of other cells. These features indicate strong biomarker and therapeutic potential and have generated broad interest, as evidenced by the steady year-on-year increase in the number of scientific publications about EVs. Important advances have been made in EV metrology and in understanding and applying EV biology. However, hurdles remain to realising the potential of EVs in domains ranging from basic biology to clinical applications, due to challenges in EV nomenclature, separation from non-vesicular extracellular particles, characterisation, and functional studies. To address the challenges and opportunities in this rapidly evolving field, the International Society for Extracellular Vesicles (ISEV) updates its 'Minimal Information for Studies of Extracellular Vesicles', which was first published in 2014 and then in 2018 as MISEV2014 and MISEV2018, respectively. The goal of the current document, MISEV2023, is to provide researchers with an updated snapshot of available approaches, and their advantages and limitations, for production, separation and characterisation of EVs from multiple sources, including cell culture, body fluids and solid tissues. In addition to presenting the latest state of the art in basic principles of EV research, this document also covers advanced techniques and approaches that are currently expanding the boundaries of the field. MISEV2023 also includes new sections on EV release and uptake and a brief discussion of in vivo approaches to study EVs. Compiling feedback from ISEV expert task forces and more than 1000 researchers, this document conveys the current state of EV research to facilitate robust scientific discoveries and move the field forward even more rapidly.
BACKGROUND: Cholesterol-loading of mouse aortic vascular smooth muscle cells (mVSMCs) downregulates miR-143/145, a master regulator of the contractile state downstream of TGFβ signaling. In vitro, this results in transitioning from a contractile mVSMC to a macrophage-like state. This process likely occurs in vivo, based on studies of mouse and human atherosclerotic plaques. OBJECTIVES: To test whether cholesterol-loading reduces VSMC TGFβ signaling and whether cholesterol efflux restores signaling and the contractile state in vitro and in vivo. METHODS: Human coronary artery (h)VSMCs were cholesterol-loaded, then treated with HDL (to promote cholesterol efflux). For in vivo studies, partial conditional deletion of Tgfβr2 in lineage-traced VSMC mice was induced. Mice wild-type for VSMC Tgfβr2 or partially deficient (Tgfβr2+/-) were made hypercholesterolemic to establish atherosclerosis. Mice were then treated with apoA1 (which forms HDL). RESULTS: Cholesterol-loading of hVSMCs downregulated TGFβ signaling and contractile gene expression; macrophage markers were induced. TGFβ signaling positively regulated miR-143/145 expression, increasing Acta2 expression and suppressing KLF4. Cholesterol-loading localized TGFβ receptors into lipid rafts, with consequent downregulation of TGFβ signaling. Notably, in cholesterol-loaded hVSMCs, HDL particles displaced receptors from lipid rafts and increased TGFβ signaling, resulting in enhanced miR-145 expression and decreased KLF4-dependent macrophage features. ApoA1 infusion into Tgfβr2+/- mice restored Acta2 expression and decreased macrophage-marker expression in plaque VSMCs, with evidence of increased TGFβ signaling. CONCLUSIONS: Cholesterol suppresses TGFβ signaling and the contractile state in hVSMCs through partitioning of TGFβ receptors into lipid rafts. These changes can be reversed by promoting cholesterol efflux, consistent with evidence in vivo.
PURPOSE: Fatigue is a common and debilitating problem in patients recovering from critical illness. To address a lack of evidence-based interventions for people with fatigue after critical illness, we co-produced a self-management intervention based on self-regulation theory. This article reports the development and initial user testing of the co-produced intervention. METHODS: We conducted three workshops with people experiencing fatigue after critical illness, family members, and healthcare professionals to develop a first draft of the FACT intervention, designed in web and electronic document formats. User testing and interviews were conducted with four people with fatigue after critical illness. Modifications were made based on the findings. RESULTS: Participants found FACT acceptable and easy to use, and the content provided useful strategies to manage fatigue. The final draft intervention includes four key topics: (1) about fatigue, which discusses the common characteristics of fatigue after critical illness; (2) managing your energy with the 5 Ps (priorities, pacing, planning, permission, position); (3) strategies for everyday life (covering physical activity; home life; leisure and relationships; work, study, and finances; thoughts and feelings; sleep and eating); and (4) goal setting and making plans. All material is presented as written text, videos, and supplementary infographics. FACT includes calls with a facilitator but can also be used independently. CONCLUSIONS: FACT is a theory-driven intervention co-produced by patient, carer, and clinical stakeholders and is based on contemporary available evidence. Its development illustrates the benefits of stakeholder involvement in ensuring interventions are informed by user needs. Further testing is needed to establish the feasibility and acceptability of FACT. IMPLICATIONS FOR CLINICAL PRACTICE: The FACT intervention shows promise as a self-management tool for people with fatigue after critical illness. It has the potential to provide education and strategies to patients at the point of discharge and follow-up.
Percutaneous left atrial appendage occlusion aims to reduce the risk of stroke in patients with atrial fibrillation (AF), particularly those who are not good candidates for systemic anticoagulation. The procedure has been studied in large international randomised trials and registries and was approved by the National Institute for Health and Care Excellence in 2014 and by NHS England in 2018. This position statement summarises the evidence for left atrial appendage occlusion and presents the current indications. The options and consensus on best practice for pre-procedure planning, undertaking a safe and effective implant, and appropriate post-procedure management and follow-up are described. Standards regarding procedure volume for implant centres and physicians, the role of multidisciplinary teams, and audits are highlighted.
PURPOSE: The delay alternating with nutation for tailored excitation (DANTE)-sampling perfection with application-optimized contrasts (SPACE) sequence facilitates 3D intracranial vessel wall imaging with simultaneous suppression of blood and CSF. However, the achieved image contrast depends closely on the selected sequence parameters, and the clinical use of the sequence is limited in vivo by observed signal variations in the vessel wall, CSF, and blood. This paper introduces a comprehensive DANTE-SPACE simulation framework, with the aim of providing a better understanding of the underlying contrast mechanisms and facilitating improved parameter selection and contrast optimization. METHODS: An extended phase graph formalism was developed for efficient spin ensemble simulation of the DANTE-SPACE sequence. Physiological processes such as pulsatile flow velocity variation, varying flow directions, intravoxel velocity variation, diffusion, and $B_1^+$ effects were included in the framework to represent the mechanisms behind the achieved signal levels accurately. RESULTS: Intravoxel velocity variation improved temporal stability and robustness against small velocity changes. Time-varying pulsatile velocity variation affected CSF simulations, introducing periods of near-zero velocity and partial rephasing. Inclusion of diffusion effects was found to substantially reduce the CSF signal. Blood flow trajectory variations had minor effects, but $B_1^+$ differences along the trajectory reduced DANTE efficiency in low-$B_1^+$ areas. Introducing low-velocity pulsatility of both CSF and vessel wall helped explain the signal heterogeneity observed in vivo in both tissue types. CONCLUSION: The presented simulation framework facilitates a more comprehensive optimization of DANTE-SPACE sequence parameters. Furthermore, the simulation framework helps to explain observed contrasts in acquired data.
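The extended phase graph (EPG) bookkeeping underlying such a framework tracks configuration states F+(k), F-(k), Z(k): each RF pulse mixes the three state types at every dephasing order k, unbalanced gradients shift the F states between orders, and relaxation attenuates them between pulses. A minimal Python/NumPy sketch for a DANTE-like low-flip-angle train in static tissue is shown below; all parameter values are illustrative placeholders, and the authors' full framework additionally models flow, diffusion, pulsatility, and $B_1^+$ variation, which this sketch omits.

# Minimal EPG simulation of a DANTE-like pulse train (static tissue only).
import numpy as np

def rf_rotation(alpha, phi):
    """EPG mixing matrix for an RF pulse with flip angle alpha and phase phi (rad)."""
    ca2, sa2, sa = np.cos(alpha / 2) ** 2, np.sin(alpha / 2) ** 2, np.sin(alpha)
    return np.array([
        [ca2, np.exp(2j * phi) * sa2, -1j * np.exp(1j * phi) * sa],
        [np.exp(-2j * phi) * sa2, ca2, 1j * np.exp(-1j * phi) * sa],
        [-0.5j * np.exp(-1j * phi) * sa, 0.5j * np.exp(1j * phi) * sa, np.cos(alpha)],
    ])

def dante_train_signal(n_pulses=50, flip_deg=10.0, tr=1e-3, t1=1.2, t2=0.05):
    """Return the observable F0 signal after each pulse of the train."""
    states = np.zeros((3, n_pulses + 1), dtype=complex)  # rows: F+(k), F-(k), Z(k)
    states[2, 0] = 1.0                                   # thermal equilibrium Mz
    e1, e2 = np.exp(-tr / t1), np.exp(-tr / t2)
    mix = rf_rotation(np.deg2rad(flip_deg), 0.0)
    signal = []
    for _ in range(n_pulses):
        states = mix @ states                 # RF pulse mixes states at every order k
        signal.append(states[0, 0])           # F0 is the measurable transverse signal
        states[:2] *= e2                      # T2 decay of transverse (F) states
        states[2] *= e1                       # T1 decay of longitudinal (Z) states
        states[2, 0] += 1 - e1                # longitudinal regrowth toward equilibrium
        states[0] = np.roll(states[0], 1)     # gradient dephasing: F+(k) -> F+(k+1)
        states[1] = np.roll(states[1], -1)    # gradient dephasing: F-(k) -> F-(k-1)
        states[1, -1] = 0.0
        states[0, 0] = np.conj(states[1, 0])  # F+(0) and F-(0) are complex conjugates
    return np.array(signal)

print(np.abs(dante_train_signal())[:5])  # approach to the attenuated steady state

Flowing spins would additionally accrue velocity-dependent phase between pulses, which is what differentiates blood and CSF suppression from the static vessel wall in the actual sequence.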
BACKGROUND: Congenital adrenal hyperplasia (CAH) encompasses a rare group of autosomal recessive disorders, characterised by enzymatic defects in steroidogenesis. Heterogeneity in management practices has been observed internationally. The International Congenital Adrenal Hyperplasia registry (I-CAH, https://sdmregistries.org/) was established to enable insights into CAH management and outcomes, yet its global adoption by endocrine centres remains unclear. DESIGN: We sought (1) to assess current practices amongst clinicians managing patients with CAH in the United Kingdom and Ireland, with a focus on choice of glucocorticoid, monitoring practices and screening for associated co-morbidities, and (2) to assess use of the I-CAH registry. MEASUREMENTS: We designed an anonymised online survey, disseminated to members of the Society for Endocrinology and the Irish Endocrine Society, to capture management practices in the care of patients with CAH. RESULTS: Marked variability was found in CAH management, with differences between general endocrinology and subspecialist settings, particularly in glucocorticoid use, biochemical monitoring and comorbidity screening, with significant disparities in reproductive health monitoring, notably in testicular adrenal rest tumour (TART) screening (p = .002), sperm banking (p = .0004) and partner testing for CAH (p
Time of day significantly influences the severity and incidence of stroke. Evidence has emerged for circadian governance not only over stroke risk factors, but also over important determinants of clinical outcome. In this review, we provide a comprehensive overview of the interplay between chronobiology and cerebrovascular disease. We discuss circadian regulation of pathophysiological mechanisms underlying stroke onset and tolerance, as well as vascular dementia. This includes cell death mechanisms, metabolism, mitochondrial function, and inflammation/immunity. Furthermore, we present clinical evidence supporting the link between disrupted circadian rhythms and increased susceptibility to stroke and dementia. We propose that circadian regulation of biochemical and physiological pathways in the brain increases susceptibility to damage after stroke during sleep and attenuates treatment effectiveness during the active phase. This review underscores the importance of considering circadian biology in understanding the pathology of, and treatment choice for, stroke and vascular dementia, and speculates that a patient's chronotype may be an important factor in developing precision treatment following stroke.
The supply of blood components and products in sufficient quantities is key to any effective health care system. This report describes the challenges faced by the English blood service, NHS Blood and Transplant (NHSBT), towards the end of the COVID-19 pandemic, which in October 2022 led to an Amber Alert being declared to hospitals, indicating an impending blood shortage. The impact on hospital transfusion services and clinical users is explained. The actions taken by NHSBT to mitigate the blood supply challenges and ensure equity of transfusion support for hospitals in England, including revisions to the national blood shortage plans, are described. This report focuses on the collaboration and communication between NHSBT, NHS England (NHSE), the Department of Health and Social Care (DHSC), the National Blood Transfusion Committee (NBTC), the National Transfusion Laboratory Managers Advisory Group for the NBTC (NTLM), the National Transfusion Practitioners Network, the medical Royal Colleges, and clinical colleagues across the NHS.
Increased iron loss may reduce the effectiveness of iron supplementation. The objective of this study was to determine whether daily oral iron supplementation increases iron loss, measured using a stable isotope of iron (58Fe). We enrolled and dewormed 24 iron-depleted Kenyan children, 24-27 months of age, whose body iron had been enriched and equilibrated with 58Fe given at least 1 year earlier. Over 3 months of supplementation (6 mg iron/kg body weight [BW]/day), mean (±SD) iron absorption was 1.10 (±0.28) mg/day. During supplementation, 0.55 (±0.36) mg iron/day was lost, equal to half of the amount of absorbed iron. Supplementation did not increase faecal haem/porphyrin or biomarkers of enterocyte damage and gut or systemic inflammation. Using individual patient data, we examined iron dose, absorption, and loss among all available long-term iron isotopic studies of supplementation. Expressed in terms of body weight, daily iron loss was correlated significantly with iron absorption (Pearson's r = 0.66 [95% CI 0.48-0.78]) but not with iron dose (r = 0.16 [95% CI -0.10 to 0.40]). The results of this study indicate that iron loss increases with daily oral iron supplementation and may blunt the efficacy of iron supplements in children. This study was registered at ClinicalTrials.gov as NCT04721964.
PURPOSE: We examined iron absorption and its regulation during two common scenarios experienced by endurance athletes. Our aims were to: (i) compare the effects of preexercise versus postexercise iron intake on iron absorption; and (ii) assess the impact of training at altitude (1800 m) on preexercise iron absorption. METHODS: Male runners (n = 18) completed three exercise trials over a 5-wk period, each preceded by 24 h of a standardized low-iron diet. First, athletes completed two 60-min treadmill running trials at 65% V̇O2max near sea level (580 m). In a randomized order, preexercise and postexercise test meals labeled with 4 mg of 57Fe or 58Fe were consumed 30 min before or 30 min after exercise. Then, the same exercise trial was performed after living and training at altitude (~1800 m) for 7 d, with the labeled test meal consumed 30 min preexercise. We collected venous blood samples preexercise and postexercise for markers of iron status and regulation, and again 14 d later to measure erythrocyte isotope incorporation. RESULTS: No differences in fractional iron absorption were evident when test meals were consumed preexercise (7.3% [4.4, 12.1]) or postexercise (6.2% [3.1, 12.5]) (n = 18; P = 0.058). Preexercise iron absorption was greater at altitude (18.4% [10.6, 32.0]) than near sea level (n = 17; P < 0.001), and hepcidin concentrations at altitude were lower at rest and 3 h postexercise compared with near sea level (P < 0.001). CONCLUSIONS: In an acute setting, preexercise and postexercise iron absorption is comparable if the meal is consumed within 30 min of exercise. Preexercise iron absorption increases 2.6-fold at altitude compared with near sea level, likely reflecting the homeostatic response to provide iron for enhanced erythropoiesis and maintain iron stores.
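The fractional absorption figures in this and the neighbouring isotope studies are back-calculated from erythrocyte isotope incorporation measured about 14 days after the labelled meal: circulating iron is estimated from blood volume and haemoglobin, the isotope label in circulation is quantified by mass spectrometry, and a fixed fraction of absorbed iron (commonly assumed to be 80% in adults) is taken to be incorporated into erythrocytes. A sketch of this standard calculation follows; all subject values are hypothetical, and baseline isotope enrichment (which real analyses subtract) is ignored for simplicity.

# Illustrative fractional iron absorption (FIA) from erythrocyte isotope
# incorporation. All subject values are hypothetical placeholders.

def circulating_iron_mg(weight_kg, hb_g_per_l):
    """Estimate total circulating iron from body weight and haemoglobin."""
    blood_volume_l = 0.065 * weight_kg         # ~65 mL blood per kg body weight
    return blood_volume_l * hb_g_per_l * 3.47  # ~3.47 mg iron per g haemoglobin

def fractional_absorption(label_in_blood_mg, dose_mg, incorporation=0.80):
    """FIA = circulating label / (dose x assumed erythrocyte incorporation)."""
    return label_in_blood_mg / (dose_mg * incorporation)

# Hypothetical worked example: a 70 kg runner with Hb 150 g/L carries ~2.4 g of
# circulating iron; suppose mass spectrometry shows 0.023% of it is 57Fe label
# from a 4 mg labelled test meal (the dose used in this study).
circ_fe = circulating_iron_mg(70, 150)                      # ~2368 mg
label_mg = circ_fe * 0.00023                                # ~0.54 mg of label
print(f"FIA = {fractional_absorption(label_mg, 4.0):.1%}")  # ~17.0%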
BACKGROUND: Iron fortificants tend to be poorly absorbed and may adversely affect the gut, especially in African children. OBJECTIVE: We assessed the effects of prebiotic galacto-oligosaccharides/fructo-oligosaccharides (GOS/FOS) on iron absorption and gut health when added to iron-fortified infant cereal. METHODS: We randomly assigned Kenyan infants (n = 191) to receive daily for 3 wk a cereal containing iron and 7.5 g GOS/FOS (7.5-g+iron group), iron and 3 g GOS/FOS (3-g+iron group), or iron without prebiotics (iron group). A subset of infants in the 2 prebiotic+iron groups (n = 66) consumed 4 stable iron isotope-labeled test meals, without and with prebiotics, both before and after the intervention. The primary outcome was fractional iron absorption (FIA) from the cereal with or without prebiotics, regardless of dose, before and after 3 wk of consumption. Secondary outcomes included fecal gut microbiota, iron and inflammation status, and effects of prebiotic dose. RESULTS: Median (25th-75th percentile) FIAs from meals before the intervention were 16.3% (8.0%-27.6%) without prebiotics compared with 20.5% (10.4%-33.4%) with prebiotics (Cohen d = 0.53; P < 0.001). FIA from the meal consumed without prebiotics after the intervention was 22.9% (8.5%-32.4%), 41% higher than from the meal without prebiotics before the intervention (Cohen d = 0.36; P = 0.002). FIA from the meal consumed with prebiotics after the intervention was 26.0% (12.2%-36.1%), 60% higher than from the meal without prebiotics before the intervention (Cohen d = 0.45; P = 0.007). After 3 wk, compared with the iron group: 1) Lactobacillus sp. abundances were higher in both prebiotic+iron groups (P < 0.05); 2) Enterobacteriaceae sp. abundances (P = 0.022) and the sum of pathogens (P < 0.001) were lower in the 7.5-g+iron group; 3) the abundance of bacterial toxin-encoding genes was lower in the 3-g+iron group (false discovery rate < 0.05); and 4) fecal pH (P < 0.001) and calprotectin (P = 0.033) were lower in the 7.5-g+iron group. CONCLUSIONS: Adding prebiotics to iron-fortified infant cereal increases iron absorption and reduces the adverse effects of iron on the gut microbiome and inflammation in Kenyan infants. This trial was registered at clinicaltrials.gov as NCT03894358.
PURPOSE: Depression is associated with low-grade systemic inflammation and impaired intestinal function, both of which may reduce dietary iron absorption. Low iron status has been associated with depression in adults and adolescents. In Swiss adolescents, we determined the associations between paediatric major depressive disorder (pMDD), inflammation, intestinal permeability, and iron status. METHODS: This is a matched case-control study in 95 adolescents with diagnosed pMDD and 95 healthy controls aged 13-17 years. We assessed depression severity using the Children's Depression Rating Scale-Revised. We measured iron status (serum ferritin (SF) and soluble transferrin receptor (sTfR)), inflammation (C-reactive protein (CRP) and alpha-1-acid glycoprotein (AGP)), and intestinal permeability (intestinal fatty acid binding protein (I-FABP)). We assessed history of iron deficiency (ID) diagnosis and treatment with a self-reported questionnaire. RESULTS: SF concentrations did not differ between adolescents with pMDD (median (IQR) SF: 31.2 (20.2, 57.0) μg/L) and controls (32.5 (22.6, 48.3) μg/L, p = 0.4). sTfR was lower among cases than controls (4.50 (4.00, 5.50) mg/L vs 5.20 (4.75, 6.10) mg/L, p