Introduction: Few studies have examined antibiotic administration during pregnancy and atopic dermatitis (AD) in children. In this context, the objective of our study was to investigate the potential association between AD occurrence in children and the timing of intrauterine exposure to antibiotics, or prenatal antibiotic administration in general.
Methods: This was a cross-sectional study of 1046 subjects. Exposure to antibiotics during pregnancy was initially evaluated using simple logistic regressions. Then, each period of antibiotic administration was adjusted for the other periods of antibiotic exposure (model 1) and for the other variables associated with AD in our database (model 2).
Results: In simple logistic regression analysis, the administration of antibiotics during pregnancy, taken as a whole period, showed a trend of association with AD (OR = 1.28, 95% CI: 0.99 – 1.65). When we analyzed antibiotic administration during each trimester of pregnancy, only antibiotic therapy during the 3rd trimester was associated with AD (OR = 2.94, 95% CI: 1.21 – 7.12). After adjusting for all the other important risk factors associated with AD in the database, antibiotic administration during the 3rd trimester of pregnancy remained independently associated with AD (OR = 2.64, 95% CI: 1.01 – 6.91).
Conclusion: Antibiotic administration during the 3rd trimester of pregnancy was independently associated with AD in children.
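The odds ratios and 95% confidence intervals reported above can be illustrated with a small worked example. With a single binary exposure, simple logistic regression reduces to the cross-tabulated odds ratio with a Wald (Woolf) confidence interval. The counts below are hypothetical, chosen only to demonstrate the calculation; they are not taken from the study.

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and Wald (Woolf) 95% CI from a 2x2 table:
    a = exposed cases, b = exposed non-cases,
    c = unexposed cases, d = unexposed non-cases."""
    or_ = (a * d) / (b * c)
    se_log_or = math.sqrt(1/a + 1/b + 1/c + 1/d)  # standard error of log(OR)
    lo = math.exp(math.log(or_) - z * se_log_or)
    hi = math.exp(math.log(or_) + z * se_log_or)
    return or_, lo, hi

# Hypothetical counts: 12 exposed children with AD, 30 exposed without,
# 60 unexposed with AD, 440 unexposed without.
or_, lo, hi = odds_ratio_ci(12, 30, 60, 440)
print(f"OR = {or_:.2f}, 95% CI: {lo:.2f} - {hi:.2f}")
```

Narrower or wider intervals than these (such as the 1.21 – 7.12 reported above) simply reflect the cell counts behind each estimate.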
Sufficient caloric intake is important to maintain a balanced health status, especially during aging, as aging and sickness share common paths. Maintaining an adequate nutritional balance is the best preventive measure to counteract the risk of malnutrition. There are several causes of malnutrition in elderly people, and tools such as anthropometric measurements and laboratory and clinical parameters can help to diagnose malnutrition in these patients. A simple validated questionnaire, the ‘Mini Nutritional Assessment’, measures the nutritional status of elderly patients. In this review, we discuss malnutrition in elderly people with and without a known cause and present some nutritional interventions. There are promising strategies that help overcome malnutrition.
Introduction. Central obesity is characterized by the accumulation of abdominal fat, which may lead to several diseases including insulin resistance. The prevalence of central obesity is higher in males, and its incidence in young adult males is increasing. Central obesity is also related to low testosterone levels. This research aimed to assess the relationship between testosterone levels and insulin resistance in young adult males with central obesity.
Methods. This was a cross-sectional study; the subjects were young adult males aged 18 to 25 years. The central obesity group consisted of 50 subjects and the non-central obesity group comprised 90 subjects. Testosterone and insulin were measured by the ECLIA method, glucose by an enzymatic method, and insulin resistance was calculated using the HOMA-IR index.
Results. The mean testosterone level in central obesity was lower than in non-central obesity (5.24 ± 1.17 vs 7.18 ± 1.54 ng/mL, p<0.001). The HOMA-IR index in central obesity was higher than in non-central obesity (4.29 ± 2.23 vs 2.46 ± 1.72, p<0.001). Testosterone levels had a negative correlation with HOMA-IR (r = -0.470, p<0.001). There was a significant difference in HOMA-IR among the quartiles of testosterone levels.
Conclusion. There is a negative correlation between testosterone levels and HOMA-IR: the lower the testosterone level, the higher the insulin resistance in young adult males.
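The HOMA-IR index named in the methods can be computed from fasting glucose and fasting insulin. A minimal sketch using the conventional formula in mg/dL units; the input values below are illustrative only and are not patient data from the study:

```python
def homa_ir(glucose_mg_dl, insulin_uU_ml):
    """HOMA-IR = (fasting glucose [mg/dL] x fasting insulin [uU/mL]) / 405."""
    return glucose_mg_dl * insulin_uU_ml / 405

# Illustrative values: fasting glucose 90 mg/dL, fasting insulin 19.3 uU/mL
print(round(homa_ir(90, 19.3), 2))  # -> 4.29, in the range reported for the central obesity group
```

The divisor 405 applies when glucose is in mg/dL; with glucose in mmol/L the equivalent divisor is 22.5.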
In recent years there has been an increase in the incidence of acute pancreatitis worldwide. In spite of efforts to improve the treatment and care of patients with acute pancreatitis, to develop imaging investigations and interventional diagnostic and treatment techniques and to facilitate patients’ access to them, acute pancreatitis continues to be associated with significant mortality and morbidity, and the treatment of patients suffering from this disease entails significant costs for healthcare systems.
Researchers are in a permanent quest to reach a global consensus for stratifying the severity of acute pancreatitis. This is needed in order to offer the proper management to each patient diagnosed with this condition and to improve hospital and health system strategies. Over the years, attempts have been made to develop algorithms that support a swift assessment of patients with acute pancreatitis, with a prediction of disease severity as close to reality as possible for optimal management. This has led to the development of severity classifications and severity scores. These require permanent updating to keep up with the technical and technological developments involved in investigating and treating patients and to encompass the most recent studies.
The goal of this paper is to review these classifications and scores, emphasizing factors that should be taken into account, and to reflect upon their utility and upon the necessity of improving them.
Background: Coronary artery disease (CAD) is the foremost cause of death in developed societies. Plaque formation in the epicardial coronary arteries and the ensuing inflammation are known pathophysiologic factors of CAD.
Objectives: We aimed to separately and simultaneously evaluate the correlation between pericardial fat pad volume and overall peri-coronary epicardial adipose tissue (EAT) thickness with coronary calcium score (CCS) to improve risk stratification of CAD.
Methods: We retrospectively reviewed patients who underwent non-invasive contrast-enhanced coronary multidetector CT (MDCT) angiography. Peri-coronary EAT thickness, pericardial fat pad volume and CCS were obtained by an expert radiologist from the patients' MDCT angiography scans.
Results: We included 141 symptomatic patients (86 men, 55 women) with an average age of 53.53 ± 12.92 years. An increment of overall peri-coronary EAT thickness (1/3 × (left anterior descending artery (LAD) + left circumflex artery (LCx) + right coronary artery (RCA))) was associated with a 49% increase in the odds of the presence of coronary artery calcification (CAC) (P = 0.004). The peri-coronary EAT average showed significant predictability in diagnosing calcified plaque. Pericardial fat pad volume was positively correlated with overall peri-coronary EAT thickness in age- and body mass index (BMI)-adjusted linear regression models (P<0.001).
Conclusion: Our results support the previous idea that peri-coronary EAT and pericardial fat pad volume might act as useful markers and better indicators of CCS, based on the Agatston score, than BMI or body weight for revealing subsequent CAD.
Pharmacogenomics describes the link between the genetic code and variations in drug response or adverse effects. It is rapidly gaining in both interest and accessibility. The knowledge of the gene-drug pairing for a wide range of medications will allow the clinician to select drugs with the best efficacy, appropriate dose and lowest likelihood of serious side effects.
In order to apply this knowledge, practitioners need to be familiar with the basic principles of pharmacodynamics and pharmacokinetics, and with how these relate to drug response. Once these are understood, so too can be the genetic variations that lead to different phenotypes. Our review explains these concepts and uses examples of commonly prescribed medications and their gene pairings. At present, the Food and Drug Administration (FDA) guidelines remain sparse with regard to pharmacogenomic testing but, despite this, direct-to-consumer testing is widely available. In this context, we detail how to interpret a pharmacogenomic report and review the indications for testing, as well as its limitations.
This information is a step towards individualized medicine, in the hope that tailoring medications and doses to an individual’s genetic make-up will yield a safe and effective response.
Background: Early intervention in septic shock is crucial to reduce mortality and improve outcomes. There is still great debate over the optimal timing of therapeutic plasma exchange (TPE) administration in septic shock patients. This study aims to investigate the effect of early initiation (within 4 hours) of TPE on hemodynamics and outcome in severe septic shock.
Methods: We conducted a prospective, before-after case series study of 16 septic shock patients requiring high doses of vasopressors, admitted to two ICUs in Cairo, Egypt. All of our patients received TPE within 4 hours of ICU admission. The fresh frozen plasma exchange volume was 1.5 × the plasma volume.
Results: In the 16 patients included in the study, mean arterial pressure was significantly improved after the initial TPE (p=0.002), and the norepinephrine dose was significantly reduced post-TPE (p<0.001). In addition, the norepinephrine dose to mean arterial pressure ratio significantly improved (p<0.001). A reduction in net 6-hour fluid balance following the first TPE was observed in all patients, by a mean of 757 mL (p=0.03). The systemic vascular resistance index was markedly improved post-TPE, along with a statistically improved cardiac index (p<0.01). Stroke volume variation was also significantly decreased after the TPE sessions (p<0.01). C-reactive protein significantly improved after TPE (p<0.01).
Conclusion: Early initiation of TPE in severe septic shock patients might improve hemodynamic measures.
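The exchange volume rule in the methods (1.5 × plasma volume) requires an estimate of each patient's plasma volume. The abstract does not state how plasma volume was estimated; the sketch below uses the Kaplan formula (0.065 × weight in kg × (1 − hematocrit)) as one common choice, so treat that estimator as an assumption for illustration only:

```python
def tpe_exchange_volume_l(weight_kg, hematocrit):
    """Exchange volume (litres) = 1.5 x estimated plasma volume (EPV).
    EPV is estimated with the Kaplan formula here -- an assumption,
    since the study does not specify its estimation method."""
    estimated_plasma_volume_l = 0.065 * weight_kg * (1 - hematocrit)
    return 1.5 * estimated_plasma_volume_l

# Example: 70 kg patient, hematocrit 0.40 -> EPV ~2.73 L, exchange ~4.1 L
print(tpe_exchange_volume_l(70, 0.40))
```

Note that the 1.5× factor comes from the abstract itself; only the plasma volume estimator is assumed.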
Introduction. Management of upper gastrointestinal bleeding (UGIB) is of great importance. We aimed to evaluate the performance of three well-known scoring systems, AIMS65, the Glasgow-Blatchford Score (GBS) and the Full Rockall Score (FRS), in predicting adverse outcomes in patients with UGIB, as well as their ability to identify low-risk patients for outpatient management. We also aimed to assess whether changing the albumin cutoff in AIMS65 and adding albumin to GBS add predictive value to these scores.
Methods. This was a retrospective study of adult patients admitted to Razi hospital (Rasht, Iran) with a diagnosis of upper gastrointestinal bleeding between March 21, 2013 and March 21, 2017. Patients who did not undergo endoscopy or had incomplete medical data were excluded. Initially, we calculated the three scores (AIMS65, GBS and FRS) for each patient using initial vital signs and laboratory data. Second, we modified AIMS65 by changing the albumin threshold from <3.5 to <3.0, and GBS by adding albumin to it. Primary outcomes were defined as in-hospital mortality, 30-day rebleeding, need for blood transfusion and need for endoscopic therapy. The secondary outcome was defined as the composite of the primary outcomes excluding need for blood transfusion. We used the area under the receiver operating characteristic curve (AUROC) to assess the predictive accuracy of the risk scores for the primary and secondary outcomes. For the albumin-GBS model, the AUROC was calculated only for predicting mortality and the secondary outcome. The negative predictive values for AIMS65, GBS and modified AIMS65 were then calculated.
Results. Of 563 patients, 3% died in hospital, 69.4% needed blood transfusion, 13.1% needed endoscopic therapy and 3% had 30-day rebleeding. The leading cause of UGIB was erosive disease. In predicting the composite of adverse outcomes, all scores had statistically significant accuracy, with the highest AUROC for albumin-GBS. However, in predicting in-hospital mortality, only albumin-GBS, modified AIMS65 and AIMS65 had acceptable accuracy. Interestingly, albumin alone had higher predictive accuracy than the other original risk scores. None of the four scores could accurately predict 30-day rebleeding; by contrast, their accuracy in predicting the need for blood transfusion was sufficiently high. The negative predictive value was 96.6% for GBS at a score of ≤2, and 85.7% and 90.2% at a score of zero for AIMS65 and modified AIMS65, respectively.
Conclusion. None of the risk scores was highly accurate as a prognostic tool in our population; however, modified AIMS65 and albumin-GBS may be the optimal choices for evaluating mortality risk and for general assessment. For identifying patients suitable for safe discharge, GBS ≤ 2 seemed to be the advisable choice.
Background. Over the past years, eosinophil infiltration involving the gastrointestinal tract and pancreas leading to eosinophilic pancreatitis, eosinophilic gastroenteritis and hypereosinophilic syndrome has been reported in the literature.
We aimed to analyze and compare the features involving patients with eosinophilic pancreatitis and pancreatitis associated with eosinophilic gastroenteritis and to determine if there is a connection between the two disorders or if they in fact meet the diagnostic criteria for hypereosinophilic syndrome.
Material and methods. The following search was performed in March 2019 on PubMed (MEDLINE) database using the medical terms “pancreatitis”, “eosinophilic pancreatitis”, “eosinophilic gastroenteritis” and “hypereosinophilic syndrome”.
Results. The search revealed 119 publications from 1970 onwards. A total of 83 papers were excluded, and the remaining 36 publications, consisting of case reports and case series, were analyzed. Of 45 patients, 20 subjects with eosinophilic gastroenteritis developed pancreatitis, 20/45 had eosinophilic pancreatitis, and 5/45 had hypereosinophilic syndrome involving the pancreas. There was no significant difference in clinical, laboratory and imaging features between the three groups, despite the multiple theories that explain the association of pancreatic and gastrointestinal eosinophilic infiltration. Although there was a strong resemblance between the three groups, histological evidence of eosinophilic gastrointestinal infiltration guided treatment towards a less invasive approach, while subjects with eosinophilic pancreatitis underwent pancreatic surgery to exclude potentially malignant lesions.
Conclusion. Although there are various theories that explain pancreatitis development in patients with eosinophilic gastroenteritis, a hypereosinophilia diagnostic work-up should be considered in all patients with a high blood eosinophil count, even in those with eosinophilic pancreatitis, in order to establish the diagnosis using a minimally invasive approach and to apply adequate treatment.
Immune thrombocytopenia (ITP) is an autoimmune hematological disorder characterized by a severely decreased platelet count of peripheral cause: platelet destruction via antiplatelet antibodies, which may also affect marrow megakaryocytes. Patients may present in critical situations, with cutaneous and/or mucous bleeding and possibly life-threatening organ hemorrhages (cerebral, digestive, etc.). Therefore, rapid diagnosis and therapeutic intervention are mandatory.
Corticotherapy represents the first treatment option but, as in any autoimmune disorder, there is a high risk of relapse. Second-line therapy options include intravenous immunoglobulins, thrombopoietin receptor agonists, rituximab and immunosuppression, but their benefit is usually temporary. Moreover, the disease generally affects young people who need repeated and prolonged treatment and hospitalization; therefore, a therapy with a long-term effect is preferred. Splenectomy – removal of the site of platelet destruction – represents an effective and stable treatment, with a 70–80% response rate and a low incidence of complications.
A challenging situation is the association of ITP with pregnancy, which further increases the risk due to the immunodeficiency of pregnancy, the major dangers of bleeding, vital risks for mother and fetus, potential risks of medication, and the necessity of prompt intervention in specific obstetrical situations – delivery, pregnancy loss, obstetrical complications, etc.
We present an updated review of the current clinical and laboratory data, as well as a detailed analysis of the available therapeutic options with their benefits and risks, and also particular associations (pregnancy, relapsed and refractory disease, emergency treatment).