Sunday, 12 May 2019

Der Orthopäde

Posttraumatic growth disturbance after pediatric distal radius fractures with development of ulnar length variance

Abstract

Background

Fractures of the distal forearm are common in children. While in adults great care is taken to preserve and restore the anatomical configuration of the wrist, in growing patients large axial deviations are tolerated because of the still high potential for spontaneous correction, and treatment is largely conservative. In everyday hand surgery practice, however, we regularly see young adults with posttraumatic wrist complaints.

Objective

Do pediatric forearm fractures, despite the tolerance of large axial deviations, really heal without residual complaints?

Materials and methods

Literature review with discussion and contextualization of the results, together with a case report from everyday hand surgery practice.

Results

Ulnar plus variance resulting from posttraumatic shortening of the radius is the most common symptomatic long-term sequela of pediatric distal forearm fractures. Pain and restricted range of motion regularly occur from a variance of >2 mm, as in our case report.

Discussion

Careful follow-up after pediatric distal forearm fractures, extending beyond consolidation of the fracture, is recommended, particularly after transepiphyseal radial osteosynthesis with Kirschner wires or involvement of the growth plate. As a reconstructive measure, an ulnar shortening osteotomy should be considered from an ulnar plus variance of >2 mm onward, given matching symptoms and imaging findings.



Torticollis in children

Abstract

Background

Torticollis is a common pathology in newborns and older children. Its differential diagnoses differ greatly in severity, in the possibility of late sequelae, and in their treatment.

Methods

This article provides an overview of the differential diagnosis of torticollis in children and the current literature, as well as an insight into our diagnostic and therapeutic algorithm.

Results

Congenital forms are distinguished from acquired ones, and painful from painless torticollis. Most cases are congenital muscular torticollis, with an estimated incidence of 0.3–1.9%. The most important differential diagnosis of congenital muscular torticollis is Klippel-Feil syndrome. Acquired torticollis sometimes has more serious causes and should always be investigated.

Conclusion

Knowledge of the possible causes and their treatment is essential in order to help the affected child and family adequately and to prevent potential late sequelae.



Hip dysplasia – new and established approaches

Abstract

Background

Hip dysplasia is one of the most common disorders in pediatric orthopedics. Its treatment changed fundamentally with the introduction of ultrasonography, and diagnosis and therapy have shifted into early infancy. For more than 20 years, the treatment of hip dysplasia has thus been guided by ultrasonography. A large number of new publications have since appeared, imaging modalities such as magnetic resonance imaging are revealing new aspects, and arthroscopy has been added as a new surgical procedure. Nevertheless, many established techniques, above all in surgical treatment, continue to play a major role.

Current treatment

This article presents new and established diagnostic procedures as well as conservative and surgical treatment measures. Given the sheer volume of the literature, not all aspects can be covered in detail. The article is oriented primarily toward the treatment established in German-speaking countries through hip screening, but international diagnostic and therapeutic perspectives are also included.



Endoprosthetic reconstruction after intercalary resection

Abstract

Background

After intercalary resection of diaphyseal bone tumors, reconstruction with a diaphyseal prosthesis is a valuable treatment option.

Objective

This article provides a comprehensive overview of the indications, surgical technique, available implants, results reported in the literature, and alternatives to alloplastic segmental replacement.

Materials and methods

We present our own experience and results, together with a review summarizing the key literature on the topic.

Results

Reported 10-year survival rates of intercalary endoprostheses range from 64% to 80%. Direct comparison of the results of different publications is hampered by small case numbers, differing implants and follow-up periods, and heterogeneous patient populations. Biological alternatives to alloplastic reconstruction include autologous bone grafts, callus distraction/segment transport, allogeneic bone grafts, and the Masquelet technique. Innovative tissue engineering approaches are still in preclinical testing.

Discussion

The short- to medium-term results of diaphyseal prostheses after intercalary resection are satisfactory and, owing to their immediate load-bearing capacity, superior to those of biological procedures. Because of potential late complications, however, they have so far been used predominantly in palliative situations and in older patients.



Classification of growth potential and resulting therapeutic consequences in spinal deformities

Abstract

Background

Adolescent idiopathic scoliosis is a three-dimensional axial deviation of the spine with a curvature in the frontal plane (Cobb angle) of more than 10° and no identifiable cause. During phases of rapid growth, worsening of the scoliosis, in the sense of an increase in the Cobb angle and the rotational component, is likely. Knowledge of the different phases of human growth is therefore crucial for the treatment of adolescent idiopathic scoliosis.

Classification

A large number of classification systems exist to help estimate growth potential. This article first examines the most common classification systems with regard to their general availability, learning curve, and accuracy when applied to adolescent idiopathic scoliosis. It then presents a treatment algorithm for adolescent idiopathic scoliosis based on the measured Cobb angle and the expected growth potential.



Etiology and significance of growth disturbances of the spine

Abstract

Background

Most growth disturbances of the spine are acquired, and their etiology is still unknown. For both scoliosis and disturbances of the sagittal profile, the idiopathic forms are the most common.

Etiology

The etiology is multifactorial: in addition to genetic, hormonal, and mechanical factors, metabolic components also appear to be involved. The risk of progression of an existing deformity is particularly high during the pubertal growth spurt, so regular clinical and radiological follow-up should be performed during this vulnerable phase. More recently, in light of the insights gained in recent years into the relationship between spinal and thoracic growth and the associated lung maturation, deformities have been classified less by their etiology than by the time of diagnosis. Accordingly, the term early-onset scoliosis subsumes all spinal deformities detected before 10 years of age.

Treatment

If conservative treatment, which should be continued for as long as possible, fails, definitive fusion should be postponed by using growth-guiding surgical techniques, thereby aiming for the best lung function possible under the circumstances.



Minimally invasive decompression procedures for spinal canal stenosis

Abstract

Background

Lumbar spinal stenosis is a common disease of older age with a marked impact on the quality of life of affected patients. Conservative treatment is used initially, but it does not correct the underlying pathological changes; surgical widening of the spinal canal addresses the cause.

Objective

Minimization of surgical access strategies while achieving effective decompression of the spinal canal and avoiding the disadvantages of macrosurgical techniques; a unilateral paravertebral approach for bilateral intraspinal decompression; special surgical techniques.

Materials and methods

Minimally invasive decompression procedures using the microscope and the endoscope are described. Different surgical strategies are presented depending on the extent (mono-, bi-, or multisegmental) and the location of the stenosis (central intraspinal, lateral recess, foraminal).

Results

Minimally invasive microscopic and endoscopic decompression procedures achieve sufficient widening of the spinal canal while avoiding the disadvantages of macroscopic surgical methods (e.g., postoperative instability). The potential complications are in part similar to those of macroscopic procedures but markedly less severe, and the subjective outcome for patients is noticeably better.

Conclusions

Given modern minimally invasive decompression procedures, surgery for lumbar spinal stenosis is a sensible and logical treatment alternative, since only surgery can treat the pathology causally.



Arthroplasty of the hand and wrist


Trauma Surgery Update Hot Topic: Orthopedic Oncology


Thumb carpometacarpal joint arthroplasty – a critical appraisal

Abstract

Compared with trapeziectomy, arthroplasty is still used far less frequently in the treatment of thumb carpometacarpal osteoarthritis. However, the first long-term results with a modern prosthesis design suggest fewer complications and revisions than with older prosthesis types. This brings the advantages of the prosthesis, namely faster rehabilitation and earlier return to work, more to the fore. From an overall economic perspective, the higher costs of arthroplasty compared with trapeziectomy could be justified by the faster return to work.

It remains to be seen whether the latest prosthesis developments, namely dual-head prostheses and anatomical surface replacement prostheses, will further improve the results and thereby increase the acceptance of thumb carpometacarpal joint arthroplasty.



Pediatric Critical Care Medicine

Association of Organ Dysfunction Scores and Functional Outcomes Following Pediatric Critical Illness
Objectives: Short-term and long-term morbidity and mortality are common following pediatric critical illness. Severe organ dysfunction is associated with significant in-hospital mortality in critically ill children; however, the performance of pediatric organ dysfunction scores as predictors of functional outcomes after critical illness has not been previously assessed. Design: Secondary analysis of a prospective observational cohort. Setting: A multidisciplinary, tertiary, academic PICU. Patients: Patients less than or equal to 18 years old admitted between June 2012 and August 2012. Interventions: None. Measurements and Main Results: The maximum pediatric Sequential Organ Failure Assessment and Pediatric Logistic Organ Dysfunction-2 scores during admission were calculated. The Functional Status Scale score was obtained at baseline, 6 months and 3 years following discharge. New morbidity was defined as a change in Functional Status Scale greater than or equal to 3 points from baseline. The performance of organ dysfunction scores at discriminating new morbidity or mortality at 6 months and 3 years was measured using the area under the curve. Seventy-three patients met inclusion criteria. Fourteen percent had new morbidity or mortality at 6 months and 23% at 3 years. The performance of the maximum pediatric Sequential Organ Failure Assessment and Pediatric Logistic Organ Dysfunction-2 scores at discriminating new morbidity or mortality was excellent at 6 months (areas under the curves 0.9 and 0.88, respectively) and good at 3 years (0.82 and 0.79, respectively). Conclusions: Severity of organ dysfunction is associated with longitudinal change in functional status and short-term and long-term development of new morbidity and mortality. 
Maximum pediatric Sequential Organ Failure Assessment and Pediatric Logistic Organ Dysfunction-2 scores during critical illness have good to excellent performance at predicting new morbidity or mortality up to 3 years after critical illness. Use of these pediatric organ dysfunction scores may be helpful for prognostication of longitudinal functional outcomes in critically ill children. All authors contributed to the conception, design, and analysis, drafted the article for important intellectual content, and collected the data. The authors have disclosed that they do not have any potential conflicts of interest. For information regarding this article, E-mail: travis.matics@advocatehealth.com © 2019 The Society of Critical Care Medicine and the World Federation of Pediatric Intensive and Critical Care Societies

The Inadequate Oxygen Delivery Index and Low Cardiac Output Syndrome Score as Predictors of Adverse Events Associated With Low Cardiac Output Syndrome Early After Cardiac Bypass
Objectives: To evaluate the effectiveness of two scoring systems, the inadequate oxygen delivery index, a risk analytics algorithm (Etiometry, Boston, MA) and the Low Cardiac Output Syndrome Score, in predicting adverse events recognized as indicative of low cardiac output syndrome within 72 hours of surgery. Design: A retrospective observational pair-matched study. Setting: Tertiary pediatric cardiac ICU. Patients: Children undergoing cardiac bypass for congenital heart defects. Cases experienced an adverse event linked to low cardiac output syndrome in the 72 hours following surgery (extracorporeal membrane oxygenation, renal replacement therapy, cardiopulmonary resuscitation, and necrotizing enterocolitis) and were matched with a control patient on criteria of procedure, diagnosis, and age who experienced no such event. Interventions: None. Measurements and Main Results: Of a total 536 bypass operations in the study period, 38 patients experienced one of the defined events. Twenty-eight cases were included in the study after removing patients who suffered an event after 72 hours or who had insufficient data. Clinical and laboratory data were collected to derive scores for the first 12 hours after surgery. The inadequate oxygen delivery index was calculated by Etiometry using vital signs and laboratory data. A modified Low Cardiac Output Syndrome Score was calculated from clinical and therapeutic markers. The mean inadequate oxygen delivery and modified Low Cardiac Output Syndrome Score were compared within each matched pair using the Wilcoxon signed-rank test. Inadequate oxygen delivery correctly differentiated adverse events in 13 of 28 matched pairs, with no evidence of inadequate oxygen delivery being higher in cases (p = 0.71). Modified Low Cardiac Output Syndrome Score correctly differentiated adverse events in 23 of 28 matched pairs, with strong evidence of a raised score in low cardiac output syndrome cases (p < 0.01). 
Conclusions: Although inadequate oxygen delivery is a Food and Drug Administration-approved indicator of risk for low mixed venous oxygen saturation, early postoperative average values were not linked with medium-term adverse events. The indicators included in the modified Low Cardiac Output Syndrome Score had a much stronger association with the specified adverse events. This work was undertaken at Great Ormond Street Hospital/UCL Institute of Child Health, which received a proportion of funding from the Department of Health's National Institute of Health Research Biomedical Research Centre's funding scheme. Drs. Ray and Peters' institutions received funding from Great Ormond Street Hospital Children's Charity (GOSHCC). Dr. Peters received funding from Faron pharmaceuticals (advisory board) and Therakind. Drs. Peters and Brown received support for article research from GOSHCC. Dr. Brown received other support from GOSHCC PICU infrastructure grant supporting Libby Rogers. The remaining authors have disclosed that they do not have any potential conflicts of interest. For information regarding this article, E-mail: samiran.ray@gosh.nhs.uk

Decision-Making About Intracranial Pressure Monitor Placement in Children With Traumatic Brain Injury
Objectives: Little is known about how clinicians make the complex decision regarding whether to place an intracranial pressure monitor in children with traumatic brain injury. The objective of this study was to identify the decisional needs of multidisciplinary clinician stakeholders. Design: Semi-structured qualitative interviews with clinicians who regularly care for children with traumatic brain injury. Setting: One U.S. level I pediatric trauma center. Subjects: Twenty-eight clinicians including 17 ICU nurses, advanced practice providers, and physicians and 11 pediatric surgeons and neurosurgeons interviewed between August 2017 and February 2018. Interventions: None. Measurements and Main Results: Participants had a mean age of 43 years (range, 30–66 yr), mean experience of 10 years (range, 0–30 yr), were 46% female (13/28), and 96% white (27/28). A novel conceptual model emerged that related the difficulty of the decision about intracranial pressure monitor placement (y-axis) with the estimated outcome of the patient (x-axis). This model had a bimodal shape, with the most difficult decisions occurring for patients who 1) had a good opportunity for recovery but whose neurologic examination had not yet normalized or 2) had a low but uncertain likelihood of neurologically functional recovery. Emergent themes included gaps in medical knowledge and information available for decision-making, differences in perspective between clinical specialties, and ethical implications of decision-making about intracranial pressure monitoring. Experienced clinicians described less difficulty with decision-making overall. Conclusions: Children with severe traumatic brain injury near perceived transition points along a spectrum of potential for recovery present challenges for decision-making about intracranial pressure monitor placement. Clinician experience and specialty discipline further influence decision-making. 
These findings will contribute to the design of a multidisciplinary clinical decision support tool for intracranial pressure monitor placement in children with traumatic brain injury. Supplemental digital content is available for this article. Direct URL citations appear in the printed text and are provided in the HTML and PDF versions of this article on the journal's website (http://journals.lww.com/pccmjournal). Dr. Bennett's institution also received funding from the National Institutes of Health (NIH) Eunice Kennedy Shriver National Institute of Child Health and Human Development and NIH/National Center for Advancing Translational Science. Drs. Bennett's and Rutebemberwa's institutions received funding from Mindsource Brain Injury Network of the Colorado Department of Human Services. Ms. Marsh's and Dr. Maertens's institutions received funding from the Colorado Department of Human Services. Dr. Hankinson's institution received funding from Colorado Traumatic Brain Injury Trust Fund. The remaining authors have disclosed that they do not have any potential conflicts of interest. For information regarding this article, E-mail: tell.bennett@ucdenver.edu

Development of an Antibiotic Guideline for Children With Suspected Ventilator-Associated Infections
Objectives: To develop a guideline for the decision to continue or stop antibiotics at 48–72 hours after their initiation in children with suspected ventilator-associated infection. Design: Prospective, multicenter observational data collection and subsequent development of an antibiotic guideline. Setting: Twenty-two PICUs. Patients: Children less than 3 years old receiving mechanical ventilation who underwent clinical testing and initiation of antibiotics for suspected ventilator-associated infection. Interventions: None. Measurements and Main Results: Phase 1 was a prospective data collection in 281 invasively ventilated children with suspected ventilator-associated infection. The median age was 8 months (interquartile range, 4–16 mo) and 75% had at least one comorbidity. Phase 2 was development of the guideline scoring system by an expert panel employing consensus conferences, literature search, discussions with institutional colleagues, and refinement using phase 1 data. Guideline scores were then applied retrospectively to the phase 1 data. Higher scores correlated with duration of antibiotics (p < 0.001) and higher Pediatric Logistic Organ Dysfunction-2 scores (p < 0.001) but not with mortality, PICU-free days, or ventilator-free days. Considering safety and outcomes based on the phase 1 data and aiming for a 25% reduction in antibiotic use, the panel recommended stopping antibiotics at 48–72 hours for guideline scores less than or equal to 2, continuing antibiotics for scores greater than or equal to 6, and offered no recommendation for scores 3, 4, and 5. The acceptability and effect of these recommendations on antibiotic use and outcomes will be prospectively tested in phase 3 of the study. Conclusions: We developed a scoring system with recommendations to guide the decision to stop or continue antibiotics at 48–72 hours in children with suspected ventilator-associated infection.
The safety and efficacy of the recommendations will be prospectively tested in the planned phase 3 of the study. Members of the Pediatric Acute Lung Injury and Sepsis Investigator (PALISI) Network are listed in Appendix 1. Supplemental digital content is available for this article. Direct URL citations appear in the printed text and are provided in the HTML and PDF versions of this article on the journal's website (http://journals.lww.com/pccmjournal). This study was supported, in part, by the Gerber Grant (number 4156) as well as the Clinical and Translational Science Awards number UL1TR000058 from the National Center for Advancing Translational Sciences (for access to Research Electronic Data Capture). Dr. Shein received funding from Accelerate Diagnostics. Drs. Karam's, Beardsley's, Karsies's, Prentice's, and Willson's institutions received funding from Gerber Foundation. Drs. Prentice and Willson received support for article research from Gerber Foundation. Dr. Tarquinio disclosed that she does not have any potential conflicts of interest. Address requests for reprints to: Steven L. Shein, MD, Division of Pediatric Critical Care, Rainbow Babies and Children's Hospital, 11100 Euclid Avenue, Cleveland, OH 44106. E-mail: Steven.shein@uhhospitals.org

Informed Consent for Bedside Procedures in Pediatric and Neonatal ICUs: A Nationwide Survey
Objectives: Primary objectives were to discover current practices of informed consent for bedside procedures in the PICU and neonatal ICU and how trainees learn to obtain consent. We also attempted to gauge if program directors felt that one method of consent was subjectively superior to another in the way it fulfilled established ethical criteria for informed consent. Design: An online anonymous survey. Participants were asked about how and by whom informed consent is currently obtained, training practices for fellows, and attitudes about how different consent methods fulfill ethical criteria. Setting: All U.S. fellowship programs for neonatology (n = 98) and pediatric critical care (n = 66) in the fall of 2017. Subjects: Neonatal and pediatric critical care fellowship program directors. Interventions: None. Measurements and Main Results: The overall response rate was 50% (82 of 164). The most common method for obtaining consent in both ICU types was via a written, separate (procedure-specific) consent (63% neonatal ICUs, 83% PICUs); least common was verbal consent (8% neonatal ICUs and 6% PICUs). Fellows were reported as obtaining consent most often (91%), followed by mid-level practitioners (71%) and residents (66%). Residents were one-fifth as likely to obtain consent in the PICU as compared with the neonatal ICU. Sixty-three percent of fellowship directors rated their programs as "strong" or "very strong" in preparing trainees to obtain informed consent. Twenty-eight percent of fellowship directors reported no formal training on how to obtain informed consent. Conclusions: Most respondents' ICUs use separate procedure-specific written consents for common bedside procedures, although considerable variability exists. Trainees reportedly most often obtain informed consent for procedures. Although most fellowship directors report their program as strong in preparing trainees to obtain consent, this study reveals areas warranting improvement.
Dr. Feltman disclosed that an internal grant supports the use of Research Electronic Data Capture (REDCap). Dr. Arnolds has disclosed that she does not have any potential conflicts of interest. This study was performed at Evanston Hospital, Evanston, IL. For information regarding this article, E-mail: marnolds@northshore.org

Role of IV Immunoglobulin in Indian Children With Guillain-Barré Syndrome
Objectives: To evaluate the outcome of Indian children with Guillain-Barré syndrome who received IV immunoglobulin compared with those who did not receive any specific therapy. Design: Single center, prospective cross-sectional study. Setting: Tertiary care neurology teaching hospital. Patients: Children (≤ 18 yr old) with Guillain-Barré syndrome were included from a prospectively maintained Guillain-Barré syndrome registry from January 2008 to April 2017. Children were classified into acute inflammatory demyelinating polyradiculoneuropathy, acute motor axonal neuropathy, acute motor-sensory axonal neuropathy, and inexcitable motor nerves based on nerve conduction study. Interventions: Out of 138 children with Guillain-Barré syndrome, 50 received IV immunoglobulin, and another 50 age- and peak disability-matched controls (who did not receive IV immunoglobulin or plasmapheresis) were selected from the same registry for comparison. Measurements and Main Results: Outcome at 3 and 6 months was defined on the basis of a 0–10 Clinical Grading Scale into complete (Clinical Grading Scale < 3), partial (Clinical Grading Scale 3–5), and poor (Clinical Grading Scale > 5) recovery. The primary outcome was the proportion of patients with complete recovery at 3 and 6 months in the IV immunoglobulin and non-IV immunoglobulin groups. Secondary outcomes included in-hospital deaths, duration of mechanical ventilation, and hospital stay. Subgroup analysis was done in the acute motor axonal neuropathy and acute inflammatory demyelinating polyradiculoneuropathy groups. The baseline characteristics were similar except for a shorter duration of illness and a higher proportion of facial palsy in the IV immunoglobulin group. Hospital deaths, duration of mechanical ventilation, hospital stay, and outcome at 3 and 6 months were not different between the two groups.
Children with acute motor axonal neuropathy had better recovery at 6 months on IV immunoglobulin (58.3% vs 11.1%; p = 0.03), but not those with acute inflammatory demyelinating polyradiculoneuropathy (58.3% vs 72.2%; p = 0.22). In nonambulatory Guillain-Barré syndrome children, complete recovery at 6 months was similar in IV immunoglobulin and non-IV immunoglobulin group (57.4% vs 57.1%; p = 0.98). Conclusions: In Indian children with Guillain-Barré syndrome, the outcome at 6 months in IV immunoglobulin treated group was similar to non-IV immunoglobulin group. Children with acute motor axonal neuropathy responded better to IV immunoglobulin. Dr. Kalita was involved in study supervision, statistical analysis, data interpretation, and writing the article. She had full access to all the data in the study and takes responsibility for the integrity of the data and the accuracy of the data analysis. Dr. Kumar was involved in the data collection, follow-up, statistical analysis, literature search, construction of figures and tables, and writing the article. Dr. Misra was involved in planning, project supervision, and writing the article. Supplemental digital content is available for this article. Direct URL citations appear in the printed text and are provided in the HTML and PDF versions of this article on the journal's website (http://journals.lww.com/pccmjournal). This study was approved by Institutional Ethics Committee, Sanjay Gandhi Post Graduate Institute of Medical Sciences, Lucknow, India. All procedures performed in studies involving human participants were in accordance with the ethical standards of the institutional and/or national research committee and with the 1964 Helsinki declaration and its later amendments or comparable ethical standards. The authors have disclosed that they do not have any potential conflicts of interest. 
For information regarding this article, E-mail: jayanteek@yahoo.com; jkalita@sgpgi.ac.in

Relationship Between Time to Left Atrial Decompression and Outcomes in Patients Receiving Venoarterial Extracorporeal Membrane Oxygenation Support: A Multicenter Pediatric Interventional Cardiology Early-Career Society Study
Objectives: To assess the variation in timing of left atrial decompression and its association with clinical outcomes in pediatric patients supported with venoarterial extracorporeal membrane oxygenation across a multicenter cohort. Design: Multicenter retrospective study. Setting: Eleven pediatric hospitals within the United States. Patients: Patients less than 18 years on venoarterial extracorporeal membrane oxygenation who underwent left atrial decompression from 2004 to 2016. Interventions: None. Measurements and Main Results: A total of 137 patients (median age, 4.7 yr) were included. Cardiomyopathy was the most common diagnosis (47%). Cardiac arrest (39%) and low cardiac output (50%) were the most common extracorporeal membrane oxygenation indications. Median time to left atrial decompression was 6.2 hours (interquartile range, 3.8–17.2 hr) with the optimal cut-point of greater than or equal to 18 hours for late decompression determined by receiver operating characteristic curve. In univariate analysis, late decompression was associated with longer extracorporeal membrane oxygenation duration (median 8.5 vs 5 d; p = 0.02). In multivariable analysis taking into account clinical confounder and center effects, late decompression remained significantly associated with prolonged extracorporeal membrane oxygenation duration (adjusted odds ratio, 4.4; p = 0.002). Late decompression was also associated with longer duration of mechanical ventilation (adjusted odds ratio, 4.8; p = 0.002). Timing of decompression was not associated with in-hospital survival (p = 0.36) or overall survival (p = 0.42) with median follow-up of 3.2 years. Conclusions: In this multicenter study of pediatric patients receiving venoarterial extracorporeal membrane oxygenation, late left atrial decompression (≥ 18 hr) was associated with longer duration of extracorporeal membrane oxygenation support and mechanical ventilation. 
Although no survival benefit was demonstrated, the known morbidities associated with prolonged extracorporeal membrane oxygenation use may justify a recommendation for early left atrial decompression. Supplemental digital content is available for this article. Direct URL citations appear in the printed text and are provided in the HTML and PDF versions of this article on the journal's website (http://journals.lww.com/pccmjournal). Supported, in part, by the CHAMPS for Mott Award, an institutional grant from the University of Michigan. Dr. Zampi's institution received funding from University of Michigan Department of Pediatrics (internal grant) and Siemens. Dr. Thiagarajan's institution received funding from Bristol Myers Squibb and Pfizer. Dr. Goldstein received funding from St. Jude Medical, Medtronic, Edwards Lifesciences, and W.L. Gore & Associates. The remaining authors have disclosed that they do not have any potential conflicts of interest. For information regarding this article, E-mail: jzampi@med.umich.edu ©2019 The Society of Critical Care Medicine and the World Federation of Pediatric Intensive and Critical Care Societies
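The ≥ 18-hour late-decompression threshold above was derived from a receiver operating characteristic curve. A common way to select such a cut-point is to maximize Youden's J (sensitivity + specificity − 1) over candidate thresholds; the sketch below illustrates the idea on entirely hypothetical data, not the study's dataset, so it yields a different cut-point than the study did.

```python
def youden_cutpoint(times, outcomes):
    """Pick the threshold t (predict 'late' when time >= t) that maximizes
    Youden's J = sensitivity + specificity - 1."""
    best_t, best_j = None, float("-inf")
    for t in sorted(set(times)):
        tp = sum(1 for x, y in zip(times, outcomes) if x >= t and y == 1)
        fn = sum(1 for x, y in zip(times, outcomes) if x < t and y == 1)
        tn = sum(1 for x, y in zip(times, outcomes) if x < t and y == 0)
        fp = sum(1 for x, y in zip(times, outcomes) if x >= t and y == 0)
        sens = tp / (tp + fn) if tp + fn else 0.0
        spec = tn / (tn + fp) if tn + fp else 0.0
        j = sens + spec - 1
        if j > best_j:
            best_t, best_j = t, j
    return best_t, best_j

# Hypothetical hours-to-decompression and prolonged-ECMO outcomes (1 = yes):
hours = [3, 4, 5, 6, 8, 12, 18, 20, 24, 30]
prolonged = [0, 0, 0, 0, 0, 1, 1, 1, 1, 1]
cut, j = youden_cutpoint(hours, prolonged)  # perfect separation at 12 h here
```

In practice the ROC analysis would be run on the full cohort with a proper ROC library; this sketch only shows why a single "optimal" threshold falls out of the curve.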

Does an Antimicrobial Time-Out Impact the Duration of Therapy of Antimicrobials in the PICU?
Objectives: Our aim was to perform an antimicrobial time-out 48–72 hours after commencing therapy in order to achieve a decrease in days of therapy per 1,000 patient-days for vancomycin, meropenem, and piperacillin/tazobactam in all PICU patients during an 8-month period. Design: This is a pre- and postimplementation quality improvement study. Setting: A 30-bed PICU at a tertiary children's hospital. Patients: Patients less than 21 years old admitted to the PICU from July 1, 2015, until March 31, 2016, or from July 1, 2016, until March 31, 2017, who received antibiotics for greater than 48 hours were eligible for inclusion. Intervention: An antimicrobial time-out was performed after 48–72 hours of antimicrobials for all patients in the PICU during postimplementation. Measurements and Main Results: The primary outcome measure was days of therapy per 1,000 patient-days for three target antibiotics: vancomycin, meropenem, and piperacillin/tazobactam. Ninety-five patients meeting inclusion criteria were admitted to the PICU during the pre–time-out period and 95 patients during the post–time-out period. The cohort that underwent time-outs had lower days of therapy for vancomycin (81.3 vs 138.1; p = 0.037) and meropenem (34.7 vs 67.1; p = 0.045). Total acquisition cost was 31% lower for piperacillin/tazobactam and vancomycin and 46% lower for meropenem postimplementation. Time-outs led to antimicrobial duration being defined 63% of the time and deescalation or discontinuation of antimicrobials 29% of the time. Conclusions: A 48–72-hour time-out process in rounds is associated with a reduction in days of therapy for antibiotics commonly used in the PICU and may lead to more appropriate usage. The time-outs are associated with discontinuation, deescalation, or duration being defined, which are key elements of Centers for Disease Control and Prevention–recommended antimicrobial stewardship programs. Dr.
Morphew's institution received funding from Memorial Health Services (ongoing consultancy agreement with Morphew Consulting, LLC). Dr. Babbitt received funding from the Memorial Medical Foundation. The remaining authors have disclosed that they do not have any potential conflicts of interest. Address requests for reprints to: Christopher J. Babbitt, MD, FCCP, Pediatric Critical Care, Miller Children's and Women's Hospital of Long Beach, Long Beach, CA. E-mail: cbabbitt@memorialcare.org ©2019 The Society of Critical Care Medicine and the World Federation of Pediatric Intensive and Critical Care Societies
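The primary outcome metric in this study, days of therapy (DOT) per 1,000 patient-days, is a standard antimicrobial stewardship measure: each calendar day on which a patient receives a given agent counts as one DOT, normalized by total patient-days. A minimal sketch with made-up figures, not the study's data:

```python
def dot_per_1000_patient_days(days_of_therapy, patient_days):
    """Days of therapy (DOT) per 1,000 patient-days: each calendar day a
    patient receives the antibiotic counts as one DOT."""
    return 1000 * days_of_therapy / patient_days

# Hypothetical unit-level figures: 120 vancomycin DOT over 1,500 patient-days
rate = dot_per_1000_patient_days(120, 1500)  # 80.0 DOT per 1,000 patient-days
```

Normalizing by patient-days is what makes the pre- and postimplementation periods comparable despite differing census.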

Clinical Profile and Predictors of Outcome of Pediatric Acute Respiratory Distress Syndrome in a PICU: A Prospective Observational Study
Objectives: To study the clinical profile, predictors of mortality, and outcomes of pediatric acute respiratory distress syndrome. Design: A prospective observational study. Setting: PICU, Advanced Pediatric Centre, Postgraduate Institute of Medical Education and Research, Chandigarh, India. Patients: All children (age > 1 mo to < 14 yr) admitted to the PICU with a diagnosis of pediatric acute respiratory distress syndrome (as per the Pediatric Acute Lung Injury Consensus Conference definition) from August 1, 2015, to November 2016. Interventions: None. Measurements and Main Results: Out of 1,215 children admitted to the PICU, 124 (11.4%) had pediatric acute respiratory distress syndrome. Fifty-six children (45.2%) died. Median age was 2.75 years (1.0–6.0 yr) and 66.9% were male. The most common primary etiologies were pneumonia, severe sepsis, and scrub typhus. Ninety-seven children (78.2%) were invasively ventilated. On multiple logistic regression, Lung Injury Score (p = 0.004), pneumothorax (p = 0.012), acute kidney injury at enrollment (p = 0.033), FIO2-D1 (p = 0.018), and PaO2/FIO2 ratio-D7 (p = 0.020) were independent predictors of mortality. A positive fluid balance at 48 hours (cut-off value > 102.5 mL/kg; p = 0.016) was associated with higher mortality. Noninvasive oxygenation variables such as the oxygenation saturation index and the saturation-FIO2 ratio were comparable to the previously used invasive variables (oxygenation index and PaO2/FIO2 ratio) in monitoring the course of pediatric acute respiratory distress syndrome. Conclusions: Pediatric acute respiratory distress syndrome constitutes a significant burden in the PICU of a developing country and is associated with significantly higher mortality. Infection remains the most common etiology. Higher severity of illness scores at admission, development of pneumothorax, and a positive fluid balance at 48 hours predicted poor outcome.
This work was performed at the Department of Pediatrics, Advanced Pediatrics Centre, Postgraduate Institute of Medical Education and Research (PGIMER), Chandigarh, India. The authors have disclosed that they do not have any potential conflicts of interest. Address requests for reprints to: Arun Bansal, MD, FCCM, FRCPCH, Department of Pediatrics, Advanced Pediatrics Centre, Postgraduate Institute of Medical Education and Research (PGIMER), Chandigarh, India 160012. E-mail: drarunbansal@gmail.com ©2019 The Society of Critical Care Medicine and the World Federation of Pediatric Intensive and Critical Care Societies
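The oxygenation variables compared in this study have standard definitions: the invasive oxygenation index (OI) and PaO2/FIO2 (P/F) ratio, and their noninvasive counterparts, the oxygenation saturation index (OSI) and SpO2/FIO2 (S/F) ratio. A quick-reference sketch with illustrative values, not patient data:

```python
def oxygenation_index(fio2, mean_airway_pressure, pao2):
    """OI = (FiO2 x mean airway pressure x 100) / PaO2 (invasive)."""
    return fio2 * mean_airway_pressure * 100 / pao2

def oxygenation_saturation_index(fio2, mean_airway_pressure, spo2):
    """OSI = (FiO2 x mean airway pressure x 100) / SpO2 (noninvasive)."""
    return fio2 * mean_airway_pressure * 100 / spo2

def pf_ratio(pao2, fio2):
    """PaO2/FiO2 ratio, with FiO2 as a fraction (0.21-1.0)."""
    return pao2 / fio2

def sf_ratio(spo2, fio2):
    """SpO2/FiO2 ratio, the noninvasive analogue of PaO2/FiO2."""
    return spo2 / fio2

# Illustrative values: FiO2 0.6, mean airway pressure 14 cm H2O,
# PaO2 70 mm Hg, SpO2 92%:
oi = oxygenation_index(0.6, 14, 70)   # 12.0
pf = pf_ratio(70, 0.6)                # ~116.7
```

The noninvasive pair substitutes pulse-oximetry saturation for an arterial blood gas, which is why they are attractive for monitoring when arterial sampling is limited.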

Gastric Residual Volume Measurement in U.K. PICUs: A Survey of Practice
Objectives: Despite little evidence, the practice of routinely measuring gastric residual volume to guide both the initiation and delivery of enteral feeding is widespread in PICUs internationally. In light of increased scrutiny of the evidence surrounding this practice, and as part of a trial feasibility study, we aimed to determine enteral feeding and gastric residual volume measurement practices in U.K. PICUs. Design: An online survey of 27 U.K. PICUs. Setting: U.K. PICUs. Subjects: A clinical nurse, senior doctor, and dietician were invited to collaboratively complete one survey per PICU and send a copy of their unit guidelines on enteral feeding and gastric residual volume. Interventions: None. Measurements and Main Results: Twenty-four of the 27 units approached (89%) completed the survey. Twenty-three units (23/24; 95.8%) had written feeding guidelines, and 19 units (19/23; 83%) sent their guidelines for review. More units fed continuously (15/24; 62%) than intermittently (9/24; 37%) via the gastric route as their primary feeding method. All but one PICU routinely measured gastric residual volume, regardless of the method of feeding. Eighteen units had an agreed definition of feed tolerance, and all of these included gastric residual volume. Gastric residual volume thresholds for feed tolerance were either volume based (mL/kg body weight) (11/21; 52%) or a percentage of the volume of feed administered (6/21; 29%). Yet only a third of units provided guidance about the technique of gastric residual volume measurement. Conclusions: Routine gastric residual volume measurement is part of standard practice in U.K. PICUs, with little guidance provided about the technique, which may affect the accuracy of gastric residual volume measurements. All PICUs that defined feed tolerance included gastric residual volume in the definition.
This is important to know when proposing a standard practice arm of any future trial of no-routine gastric residual volume measurement in critically ill children. The views expressed are those of the author(s) and not necessarily those of the NHS, the National Institute for Health Research, or the Department of Health and Social Care. Supplemental digital content is available for this article. Direct URL citations appear in the printed text and are provided in the HTML and PDF versions of this article on the journal's website (http://journals.lww.com/pccmjournal). Supported by the National Institute for Health Research (Health Technology Assessment reference 16/94/02). Drs. Tume's, Arch's, Latten's, Deja's, Roper's, Eccleson's, Hickey's, Brown's, and Gale's institutions received funding from the National Institute for Health Research (NIHR) Health Technology Assessment Programme. Drs. Tume, Arch, Latten, Roper, Pathan, Eccleson, Hickey, and Brown received support for article research from the NIHR Health Technology Assessment Programme. Dr. Hickey's institution received funding from the NIHR Efficacy and Mechanism Evaluation Programme (EME Ref 15/20/01), and she received funding from the University of Liverpool (personal fees relating to the production of a Clinical Study Report for University Hospitals Bristol NHS Foundation Trust). Ms. Brown's institution received funding from the NIHR Doctoral Fellowship Programme. Dr.
Gale's institution received funding from the Medical Research Council, Chiesi Pharmaceuticals, the Canadian Institutes of Health Research (CIHR), and the Department of Health (England); he has received support from Chiesi Pharmaceuticals to attend an educational conference; in the past 5 years, he has been an investigator on research grants received from the Medical Research Council, NIHR, CIHR, the Department of Health in England, the Mason Medical Research Foundation, the Westminster Medical School Research Trust, and Chiesi Pharmaceuticals; and he received support for article research from Research Councils UK. Dr. Valla received funding from Baxter, Fresenius Kabi, and Nutricia. Dr. Dorling's institution received funding from the National Institute for Health Research and Nutricia in 2017 and 2018 for part of his salary to work as an expert advisor on a trial of enteral insulin; he disclosed he was a member of the NIHR HTA General Board (from 2017 to 2018) and the NIHR HTA Maternity, Newborn and Child Health Panel (from 2013 to 2018); and he received support for article research from the NIHR. The remaining authors have disclosed that they do not have any potential conflicts of interest. For information regarding this article, E-mail: lyvonne.tume@uwe.ac.uk ©2019 The Society of Critical Care Medicine and the World Federation of Pediatric Intensive and Critical Care Societies

Immunopathology

Transcriptional control of macrophage polarisation in type 2 diabetes

Abstract

Type 2 diabetes (T2D) is today considered an inflammatory disease. Inflammatory processes in T2D are orchestrated by macrophage activation in different organs. Macrophages undergo classical M1 pro-inflammatory or alternative M2 anti-inflammatory activation in response to tissue microenvironmental signals. These subsets of macrophages are characterised by their expression of cell surface markers and secreted cytokines and chemokines. Transcriptional regulation is central to the polarisation of macrophages, and several major pathways have been described as essential to promote the expression of specific genes, which dictate the functional polarisation of macrophages. In this review, we summarise the current knowledge of the transcriptional control of macrophage polarisation and the role it plays in the development of insulin resistance.



Correction to: The pathogenicity of Th17 cells in autoimmune diseases

Unfortunately, an error occurred in the following passage of the article. The word "receptor" was missing from the sentence "Because T cells do not express GM-CSF receptor [41], GM-CSF affects non-T cells."



Pathogenicity of acquired immunity in human diseases


A new therapeutic target: the CD69-Myl9 system in immune responses

Abstract

CD69 is an activation marker on leukocytes. Early studies showed that CD69+ cells were detected in the lungs of patients with asthma and eosinophilic pneumonia, suggesting that CD69 might play crucial roles in the pathogenesis of such inflammatory diseases, rather than simply being an activation marker. Intensive studies using mouse models have since clarified that CD69 is a functional molecule regulating immune responses. We discovered that Myosin light chain 9, 12a, and 12b (Myl9/12) are ligands for CD69 and that platelet-derived Myl9 forms a net-like structure (Myl9 nets) that is strongly detected inside blood vessels in the inflamed lung. CD69-expressing activated T cells attached to the Myl9 nets can thereby migrate into the inflamed tissues through a system known as the CD69-Myl9 system. In this review, we summarize the discovery of the CD69-Myl9 system and discuss how this system is important in inflammatory immune responses. In addition, we discuss our recent finding that CD69 controls the exhaustion status of tumor-infiltrating T cells and that blockade of CD69 function enhances anti-tumor immunity. Finally, we discuss the possibility of CD69 as a new therapeutic target for patients with intractable inflammatory disorders and tumors.



Treg cells in autoimmunity: from identification to Treg-based therapies

Abstract

Regulatory T (Treg) cells are key regulators of inflammation and important for immune tolerance and homeostasis. Major progress has been made in the identification and classification of Treg cells, and technological advances have yielded deep insights into their epigenetic regulation. The use of fate reporter mice has allowed the functional consequences of loss of Foxp3 expression to be addressed: depending on the environment, Treg cells gain effector functions upon loss of Foxp3 expression. However, the traditional view that Treg cells gaining effector functions necessarily become pathogenic has been challenged by recent findings, which support the notion of Treg cell lineage plasticity. Treg cell stability is also a major issue for Treg cell therapies, as clinical trials are designed to use polyclonal Treg cells as therapeutic tools. Here, we summarize the role of Treg cells in selected autoimmune diseases and recent advances in the field of Treg-targeted therapies.



CD8+ T cell exhaustion

Abstract

CD8+ T cells are important for protective immunity against intracellular pathogens and tumors. In chronic infection or cancer, CD8+ T cells are exposed to persistent antigen and/or inflammatory signals. This excessive signaling often drives CD8+ T cells into a state of gradual deterioration of function called "exhaustion." Exhausted T cells are characterized by progressive loss of effector functions (cytokine production and killing function), expression of multiple inhibitory receptors (such as PD-1 and LAG3), dysregulated metabolism, and poor memory recall responses and homeostatic proliferation. These altered functions are closely related to an altered transcriptional program and epigenetic landscape that clearly distinguish exhausted T cells from normal effector and memory T cells. T cell exhaustion is often associated with inefficient control of persisting infections and cancers, but re-invigoration of exhausted T cells with inhibitory receptor blockade can improve immunity and disease outcome. Accumulating evidence supports the therapeutic potential of targeting exhausted T cells. However, exhausted T cells comprise a heterogeneous cell population with distinct responsiveness to intervention. Understanding the molecular mechanisms of T cell exhaustion is therefore essential to establish rational immunotherapeutic interventions.



Epigenetic regulation of T helper cells and intestinal pathogenicity

Abstract

Inflammatory bowel diseases (IBDs) are characterized by relapsing and remitting chronic intestinal inflammation. Previous studies have demonstrated the contributions of genetic background, environmental factors (food, microbiota, use of antibiotics), and host immunity in the development of IBDs. More than 200 genes have been shown to influence IBD susceptibility, most of which are involved in immunity. The vertebrate immune system comprises a complex network of innate and adaptive immune cells that protect the host from infection and cancer. Dysregulation of the mutualistic relationship between the immune system and the gut environment results in IBD. Considering the fundamental role of epigenetic regulation in immune cells, epigenetic mechanisms, particularly in T helper (Th) cells, may play a major role in the complex regulation of mucosal immunity. Epigenetic regulation and dysregulation of Th cells are involved in the maintenance of intestinal homeostasis and its breakdown in IBD.



The immunopathology of lung fibrosis: amphiregulin-producing pathogenic memory T helper-2 cells control the airway fibrotic responses by inducing eosinophils to secrete osteopontin

Abstract

Fibrosis is defined as excessive deposition of extracellular matrix (ECM) in the parenchyma of various organs and sometimes leads to irreversible organ malfunction, as in idiopathic pulmonary fibrosis (IPF), a fatal disorder of the lung. Chronic inflammatory stimuli induce fibrotic responses in various organs. Various immune cells, including T helper (Th) cells in the lung, protect the host from harmful particles, including pathogenic microorganisms. However, dysregulation of the function of these immune cells in the lung sometimes causes inflammatory diseases, such as lung fibrosis. In this review, we outline the cellular and molecular mechanisms underlying the pathogenic fibrotic responses in the lung. We also introduce the concept of the "Pathogenic Th population disease induction model," in which unique subpopulations of certain Th cell subsets control the pathology of immune-mediated inflammatory diseases. Finally, we present our recent findings, which demonstrate that amphiregulin-producing pathogenic memory Th2 cells control airway fibrosis through osteopontin produced by inflammatory eosinophils. The identification of this new pathogenic Th cell population supports the concept of the "Pathogenic Th population disease induction model" and will provide novel strategies for treating intractable diseases, including lung fibrosis.



T cell pathology in skin inflammation

Abstract

Forming the outer body barrier, our skin is permanently exposed to pathogens and environmental hazards. Therefore, skin diseases are among the most common disorders. In many of them, the immune system plays a crucial pathogenetic role. For didactic and therapeutic reasons, classification of such immune-mediated skin diseases according to the underlying dominant immune mechanism rather than to their clinical manifestation appears to be reasonable. Immune-mediated skin diseases may be mediated mainly by T cells, by the humoral immune system, or by uncontrolled unspecific inflammation. According to the involved T cell subpopulation, T cell–mediated diseases may be further subdivided into T1 cell–dominated (e.g., vitiligo), T2 cell–dominated (e.g., acute atopic dermatitis), T17/T22 cell–dominated (e.g., psoriasis), and Treg cell–dominated (e.g., melanoma) responses. Moreover, T cell–dependent and -independent responses may occur simultaneously in selected diseases (e.g., hidradenitis suppurativa). The effector mechanisms of the respective T cell subpopulations determine the molecular changes in the local tissue cells, leading to specific microscopic and macroscopic skin alterations. In this article, we show how the increasing knowledge of the T cell biology has been comprehensively translated into the pathogenetic understanding of respective model skin diseases and, based thereon, has revolutionized their daily clinical management.



The pathogenicity of Th17 cells in autoimmune diseases

Abstract

IL-17-producing T helper (Th17) cells have been implicated in the pathogenesis of many inflammatory and autoimmune diseases. Targeting the effector cytokines IL-17 and GM-CSF secreted by autoimmune Th17 cells has been shown to be effective for the treatment of the diseases. Understanding a molecular basis of Th17 differentiation and effector functions is therefore critical for the regulation of the pathogenicity of tissue Th17 cells in chronic inflammation. Here, we discuss the roles of proinflammatory cytokines and environmental stimuli in the control of Th17 differentiation and chronic tissue inflammation by pathogenic Th17 cells in humans and in mouse models of autoimmune diseases. We also highlight recent advances in the regulation of pathogenic Th17 cells by gut microbiota and immunometabolism in autoimmune arthritis.



Primatology

The Effect of Dominance Rank on the Distribution of Different Types of Male–Infant–Male Interactions in Barbary Macaques (Macaca sylvanus)

Abstract

In several cercopithecine species, males exhibit a specific type of male–infant–male interaction during which two males briefly manipulate an infant. These interactions typically occur after a male carrying an infant (infant holder) approaches or is approached by another male who is not holding an infant (infant nonholder). The agonistic buffering and relationship management hypotheses explain these interactions as a tool to establish and maintain social bonds among males. Both hypotheses predict that males preferentially use the opportunity to interact and bond with males dominant to themselves. However, the agonistic buffering hypothesis predicts that males preferentially initiate male–infant–male interactions with the highest ranking males available, whereas the relationship management hypothesis predicts that males are more likely to interact with males that are close to them in rank. To test these predictions, we collected data on 1562 male–infant–male interactions during 1430 hours of focal observation of 12 infants in one group of wild Barbary macaques (Macaca sylvanus) in Morocco. Using generalized linear mixed-effect models, we found that males preferentially initiated interactions with males that were dominant to them. However, we observed this effect only for interactions initiated by the infant holder; in interactions initiated by nonholders, the receiver's relative rank did not predict the frequency of interactions. Males also initiated more interactions with males close in rank to themselves than with distantly ranked males. Our results support the relationship management hypothesis but also indicate that the different types of male–infant–male interactions may require different explanations.



Interpreting People's Behavior Toward Primates Using Qualitative Data: a Case Study from North Morocco

Abstract

People's perceptions of primates vary across and within cultures and may not be consistent with their behavior toward the primates themselves. We used qualitative data from semistructured and unstructured interviews with shepherds from 10 villages around Bouhachem oak forest in Morocco to describe and discuss shepherds' behavior when they encounter Barbary macaques (Macaca sylvanus). When macaques enter agricultural fields to feed on crops, mature men trap, attach a marker to them (a hat or a rattle), and release them. In contrast, young men and boys working as shepherds hunt and kill macaques when they encounter them in the forest. We interpret these findings in the context of the historical, social, and cultural factors that underlie these cross-species encounters. We suggest the different ways men behave toward macaques over their lives are related to their age and social status. Understanding that men's behavior varies, and changes over the life course, we continued to engage positively with shepherds of all ages, sharing general information about the macaques and conducting community projects benefiting villagers' health. This strategy led shepherds from six villages to stop hunting macaques, with the behavior of young men and boys changing to reflect that of older men. We suggest that gaining a deep, contextualized understanding of the human–primate interface and fostering intrinsic values for a species are effective in gaining communities' support and fundamental to facilitating changes in people's behavior in favor of conservation.



The Contributions to Primatology of Colin P. Groves (1942–2017): Corecipient of the Lifetime Achievement Award of the International Primatological Society, 2018


Correction to: Camera Traps Clarify the Distribution Boundary between the Crested Black Macaque (Macaca nigra) and Gorontalo Macaque (Macaca nigrescens) in North Sulawesi

The original version of this article unfortunately contained a mistake in Figure 1. The revised figure is presented below:



Dorothy Cheney (1950–2018)


Camera Traps Clarify the Distribution Boundary between the Crested Black Macaque (Macaca nigra) and Gorontalo Macaque (Macaca nigrescens) in North Sulawesi

Abstract

Primates are among the most threatened taxa of mammals in the world. Tracking the status of primates requires continually assessing population distribution, abundance, and threats, which in turn requires the extent of a species' occurrence to be known. Defining this important parameter in practice can be difficult. In this article we demonstrate how camera traps can be used to address this with a case study involving two macaque species on the northernmost peninsula of Sulawesi, Indonesia. We deployed 83 camera traps across the suspected interface between the Critically Endangered Macaca nigra and the Vulnerable Macaca nigrescens. Using spatially explicit photographic records of both species, we found the boundary between the two species is 14.85 km farther west than previously defined. We estimate that the additional area encompassed by this new boundary location equates to 224 km2 of suitable habitat for M. nigra, an increase of 7.5%. This has important implications for more accurately assessing the threatened status of both species in the future. As camera traps become cheaper, their deployment at broader spatial scales is becoming more feasible, which in turn provides opportunities to enhance our ecological understanding of species. Here, we demonstrate an additional insight that can be gained from such technology, by showing how the range extent of a Critically Endangered primate can be accurately demarcated. Accordingly, we encourage primatologists to think more broadly about the possible applications of camera traps and to include them as tools in their conservation inventories.



Phylogeography, Population Genetics, and Conservation of Javan Gibbons (Hylobates moloch)


Craniofacial Shape and Nonmetric Trait Variation in Hybrids of the Japanese Macaque (Macaca fuscata) and the Taiwanese Macaque (Macaca cyclopis)

Abstract

It has become apparent that natural hybridization is far more common and may play a much greater role in evolution than has historically been recognized. The skeletal morphology of hybrid primates is notoriously variable and difficult to predict. Indeed, before the advent of genetic sequencing techniques, many wild hybrid populations went undetected. Though many species of primates are now known to hybridize naturally and are likely to have done so for millions of years, anthropogenic alterations to the environment are increasingly restricting or altering primate species ranges and contact zones and driving hybridization between populations that may otherwise never have come into contact. The case of hybridizing Japanese and Taiwanese macaques (Macaca fuscata and Macaca cyclopis) is an excellent example of this, as these two island species could not have come into contact without human interference. Here we apply 3D geometric morphometrics and nonmetric trait analysis to the crania and dentition of hybrid macaques (N = 70) and their parental species, M. fuscata (N = 57) and M. cyclopis (N = 51). The exploration of 3D shape variation identifies mildly transgressive morphology in the hybrids and a general tendency toward the M. fuscata morphotype overall, but less variability in the hybrid morphotype than has been identified in previous studies of primate hybrids. We also identify a small number of nonmetric traits that differentiate the hybrids from the parental species, although the power of these traits to distinguish between groups is weak and their relationship with hybridity is unclear. We conclude that the relatively short divergence time between the parent species is likely to help explain the observed differences in hybrid morphotype, and that further exploration of the relationship between degree of evolutionary divergence and hybrid morphology may help us to better explain and predict hybrid morphology in other taxa.



An Application of Autonomous Recorders for Gibbon Monitoring

Abstract

Population monitoring is very important in wildlife management and conservation. All 18 species of gibbons are considered threatened with extinction and listed on the IUCN Red List of Threatened Species. Thus, understanding and effectively monitoring their population trends and distribution are critical. To date, all gibbon surveying and monitoring programs have been conducted by human surveyors; this is expensive, laborious, and dependent on the surveyors' skills. In particular, estimating group density often requires a large sample size with several skilled observers working simultaneously in the field. We used autonomous recorders to record the calls of the southern yellow-cheeked crested gibbon (Nomascus gabriellae) for at least 3 days at each of 57 posts in the Nam Cat Tien sector, Cat Tien National Park, Vietnam, from July to October 2016. We extracted gibbon calls from the recordings auditorily or visually using spectrograms in RAVEN software. We detected gibbon calls at 40 recording posts during the survey. The proportion of recorders with gibbon calls in the eastern region of the Nam Cat Tien sector (mean = 0.79; SE = 0.13) was higher than that in the western region (mean = 0.46; SE = 0.11). The estimated probability of occurrence in the eastern region (ψ = 0.56; SE = 0.20) was also higher than that in the western region (ψ = 0.23; SE = 0.16). Passive acoustic data were useful for investigating spatial variation in the probability of occurrence of gibbons. We recommend using autonomous recorders combined with occupancy models to complement human surveyors in gibbon monitoring in areas with low gibbon density, because this approach is efficient, low cost, and not subject to errors caused by human surveyors. In areas of high gibbon density, absolute density estimates obtained by human surveyors might be a more suitable indicator.
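The occupancy estimates (ψ) reported above adjust the raw proportion of posts with detections for imperfect detection. A minimal single-season occupancy model can be sketched as follows; the detection counts, the grid-search fitting, and the K = 3 survey days are hypothetical simplifications for illustration, not the study's analysis.

```python
import math

def fit_occupancy(detections, K):
    """Maximum-likelihood fit of occupancy (psi) and per-survey detection
    probability (p) from per-site detection counts over K surveys."""
    best_psi, best_p, best_ll = None, None, float("-inf")
    grid = [i / 100 for i in range(1, 100)]  # 0.01 .. 0.99
    for psi in grid:
        for p in grid:
            ll = 0.0
            for d in detections:
                if d > 0:
                    # site known occupied: psi * p^d * (1-p)^(K-d)
                    ll += math.log(psi) + d * math.log(p) + (K - d) * math.log(1 - p)
                else:
                    # all-zero history: occupied-but-missed, or unoccupied
                    ll += math.log(psi * (1 - p) ** K + 1 - psi)
            if ll > best_ll:
                best_psi, best_p, best_ll = psi, p, ll
    return best_psi, best_p

# Hypothetical detection counts (days with calls out of K = 3) at 10 posts;
# naive occupancy is 6/10 = 0.6, while the fitted psi comes out slightly
# higher because some all-zero posts may be occupied but undetected.
psi_hat, p_hat = fit_occupancy([3, 2, 0, 1, 0, 2, 0, 3, 1, 0], K=3)
```

In practice such models are fitted with dedicated software (and with covariates, which is presumably why the reported ψ values can differ from the naive proportions); the grid search here only makes the likelihood structure visible.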



Is There a Link Between Matriline Dominance Rank and Linear Enamel Hypoplasia? An Assessment of Defect Prevalence and Count in Cayo Santiago Rhesus Macaques (Macaca mulatta)

Abstract

Linear enamel hypoplasias are developmental defects, ranging in appearance from microscopic to macroscopic furrows in enamel, that encircle the tooth crown. Environmental stressors, including lack of food and infectious diseases during early periods of development, are known to induce hypoplasias in human and nonhuman primates. Social correlates of hypoplasias have not been extensively studied, however. Here, we examined the relationship between matriline dominance rank and linear enamel hypoplasia prevalence (i.e., absence or presence) and count (the total number of hypoplasias observed) in free-ranging rhesus monkeys (Macaca mulatta) on Cayo Santiago, Puerto Rico. We sampled 86 female offspring from low-, mid-, and high-ranking matrilines. Our results show that although hypoplasia prevalence and count were numerically higher in the combined group of low- and mid-ranking matrilines than in high-ranking matrilines, this effect was not statistically significant. There was, however, a significant negative relationship between age and hypoplasia prevalence, as well as between age and the mean number of enamel defects, likely due to the attrition and abrasion of enamel that wear away shallow defects as individuals age. Future studies would benefit from using larger sample sizes and collecting detailed behavioral data to determine if and when social status mediates enamel defect formation.



Der Radiologe

Legally regulated teleradiology: implementing data protection requirements

Abstract

With the entry into force of the Radiation Protection Act (StrlSchG) and the Radiation Protection Ordinance (StrlSchV), which "superseded" the X-ray Ordinance (RöV) as of December 31, 2018, little changed in the requirements specific to teleradiology. However, the data protection requirements of the General Data Protection Regulation (GDPR) also apply to teleradiology. First of all, any processing of health data is prohibited unless a legal basis permits it. Furthermore, demonstrable compliance with the principles laid down in Art. 5 GDPR is required, patients must be informed, the security of the data must be ensured, and so on. Fundamentally, it must be clarified on which legal basis the cooperation between the treating hospital and the teleradiologist takes place: Is it an independent treatment by the teleradiologist himself? Or is it rather joint processing within the meaning of the GDPR? If new technologies are used, for example data transfer via a cloud application, a data protection impact assessment may be required. Although many GDPR requirements are already addressed by the teleradiology-specific requirements, i.e., in an approved teleradiology setup many GDPR requirements are already met, several points remain that deserve closer attention.



Directly to a neurovascular center or "drip and ship"?

Abstract

Background

The scientific evidence for the high efficacy of endovascular stroke treatment in proximal large vessel occlusion (LVO) has led to this therapy being accepted as the gold standard for stroke patients.

Objective

This review attempts to present the various organizational models for thrombectomy and to analyze which model is preferable under which circumstances.

Materials and methods

Based on an analysis of the recent scientific literature, models for optimizing patient transport ("drip and ship" and "mothership") and for optimizing the availability of interventionalists ("drip and drive" and "remote mentoring") are presented and weighed against each other. In addition, considerations regarding thrombectomy rates, the prevalence of LVOs, and the modeling of organizational schemes are discussed.

Results

If the stroke patient's location is equidistant from or closer to a comprehensive stroke center (CSC) than to a primary stroke center (PSC), the patient should be transported directly to the CSC ("mothership"). If, however, a PSC is closer to the site of the stroke than a CSC and the time since symptom onset is still within the thrombolysis window, the decision depends on many variables.

Discussion

Given the inconclusive evidence, no recommendation for a generally superior organizational model can currently be made.



Current CO2 angiography

Abstract

Background

Carbon dioxide (CO2) is a very well validated alternative to iodinated contrast media in diagnostic and interventional angiography. Nevertheless, its routine use is still limited to specialized centers.

Objective

To describe the current role and limitations of CO2 in diagnostic and interventional angiography (venous and arterial).

Materials and methods

A comprehensive literature review on CO2 angiography (physical properties, indications, contraindications, applications) was performed.

Results

Carbon dioxide can be used as a safe alternative for diagnosis and for supporting interventions in many arterial and venous vascular territories, with the exception of arterial applications above the diaphragm, which constitute the most important contraindication. Moreover, owing to its low viscosity, CO2 is more effective than iodinated contrast medium for detecting bleeding from small vessels.

Conclusions

CO2 angiography is a safe and effective technique and can be employed as a helpful alternative. In some cases it even offers advantages over iodinated contrast media.



Contrast-agent-free magnetic resonance angiography

Abstract

Background

The use of magnetic resonance imaging (MRI) contrast agents should be minimized in MR angiography (MRA).

Objective

To provide an overview of existing non-contrast MRI techniques for MR angiography.

Materials and methods

Non-contrast MR angiography exploits the facts that unsaturated flowing blood appears hyperintense relative to static tissue (time-of-flight MRA), that flow induces a measurable phase shift (phase-contrast MRA), that blood labeling by selective inversion can depict a bolus passage dynamically (arterial spin labeling), and that dedicated MRI sequences can exploit the intrinsic contrast properties of blood.

Conclusions

With an appropriate choice of technique, non-contrast MR angiography can reliably depict vessels such as the cerebral and coronary arteries and can provide additional information on flow dynamics.
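The flow-induced phase shift exploited by phase-contrast MRA can be made concrete with the standard first-order relation (a general description, not specific to the sequences discussed in this abstract). For a bipolar velocity-encoding gradient with first moment $M_1$, spins moving at constant velocity $v$ accumulate a phase

$$\Delta\varphi = \gamma\, M_1\, v,$$

where $\gamma$ is the gyromagnetic ratio. The encoding velocity $v_{\mathrm{enc}}$ is chosen so that $v = v_{\mathrm{enc}}$ produces $\Delta\varphi = \pi$; measured velocities are then recovered as $v = v_{\mathrm{enc}}\,\Delta\varphi/\pi$, which is why velocities above $v_{\mathrm{enc}}$ alias.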



Diffusion imaging: diagnostic extension or replacement for contrast media in the early detection of malignancies?

Abstract

Medical research in oncologic diagnostic imaging with magnetic resonance imaging increasingly includes diffusion-weighted sequences. Depending on the chosen sequence modification, diffusion-weighted sequences can depict different diffusion processes in the body at the microstructural level and, in addition to visual assessment, allow quantitative analyses of the acquired image data. Because diffusion-weighted sequences do not require the administration of gadolinium-based contrast agents but merely quantify the mobility of water molecules naturally present in the body, they represent a diagnostic method that, for specific clinical questions and depending on current and future developments, could potentially attain independent diagnostic value. Current clinical diagnostic studies and technical developments, including the growing influence of artificial intelligence on radiology, support this process. Diffusion-weighted imaging could make a substantial contribution especially to selective screening programs for the early detection of tumors. Before routine clinical use, however, standardization and quality assurance must be established.



Intravenous thrombolysis for acute stroke treatment: the latest developments

Abstract

Clinical issue

Intravenous (IV) thrombolysis and mechanical recanalization are considered the two essential pillars of acute stroke treatment in patients with vessel occlusion in the anterior circulation. So-called bridging thrombolysis is increasingly being discussed.

Results

Stroke patients who were transported directly to a neurovascular center and promptly received endovascular therapy there showed, after thrombolysis, a lower preinterventional recanalization rate and lower 90-day mortality, but no significant difference in clinical outcome at 3 months compared with stroke patients undergoing mechanical recanalization alone. Increased intracranial hemorrhage rates were detected in the bridging thrombolysis group.

Conclusion

IV thrombolysis remains a necessary treatment concept in acute stroke. Further studies on its administration when prompt endovascular therapy is available should be conducted.



"Time is brain"

Abstract

Background

For acute stroke, evidence-based, causally oriented therapies are available in the form of thrombolysis with rt-PA and interventional thrombectomy. The clinical benefit for the patient, however, is highly time dependent.

Methods

An overview of the critical time intervals in acute stroke management is given, and options for influencing them are presented.

Results

Both prehospital and in-hospital time intervals can be shortened, with a demonstrated resulting clinical benefit. The measures required are manifold and demand clear procedural guidelines and constant training.

Conclusion

Optimizing time management across the entire acute diagnostic and therapeutic pathway, by avoiding specific delays and improving uniform workflows, has top priority for the efficiency and safety of these therapies.



Cardiac CT: why, when, and how

Abstract

Purpose

The aim of this study was to review established and emerging techniques of cardiac computed tomography (CT) and their clinical applications with a special emphasis on new techniques, recent trials, and guidelines.

Technological innovations

Cardiac CT has made great strides in recent years to become an ever more robust and safe imaging technique. The improvements in spatial and temporal resolution are equally important as the substantial reduction in radiation exposure, which has been achieved through prospective ECG-triggering, low tube voltage scanning, tube current modulation, and iterative reconstruction techniques. CT-derived fractional flow reserve and CT myocardial perfusion imaging are novel, investigational techniques to assess the hemodynamic significance of coronary stenosis.

Established and emerging indications

In asymptomatic patients at risk for coronary artery disease, CT coronary artery calcium scoring is useful for assessing cardiovascular risk and guiding the intensity of risk factor modification. Coronary CT angiography is an excellent noninvasive test to rule out obstructive coronary artery disease in patients with stable chest pain. In patients with acute chest pain, a normal ECG, and normal cardiac enzymes, cardiac CT can safely rule out acute coronary syndrome, although its benefit and role in this indication remain controversial. Cardiac CT is the established standard for planning transcatheter aortic valve implantation and, increasingly, minimally invasive mitral valve procedures.

Practical recommendations

Our review makes practical recommendations on when and how to perform cardiac CT and provides templates for structured reporting of cardiac CT examinations.



New research center for imaging and radiation oncology at the German Cancer Research Center (Deutsches Krebsforschungszentrum) in Heidelberg


Pharmacokinetics of gadolinium-based contrast agents

Abstract

Background

Gadolinium-based contrast agents are routinely used in magnetic resonance imaging examinations. In some tissues (skin, brain, bone) they remain detectable long after administration.

Objective

What is known about the pharmacokinetics of gadolinium-based contrast agents and about their deposition in tissues?

Materials and methods

Basic research and expert recommendations are discussed.

Results

Gadolinium-based contrast agents distribute rapidly throughout the body and are eliminated renally. An initially fast elimination (half-life about 2 h) is followed by a slow elimination phase (half-life about 6 days), which reflects release from tissues. Deposits in the brain occur particularly after administration of linear, nonionic contrast agents. It remains unclear whether these deposits consist of chelated or free gadolinium and whether otherwise healthy individuals are equally affected. Risks from brain deposits have not been demonstrated to date.
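The two elimination phases described here imply a biexponential decay of the retained dose. A minimal sketch using the half-lives quoted in the abstract; the 95%/5% amplitude split between the fast and slow phases is an assumption for illustration, not a value from the abstract:

```python
import math

# Half-lives from the abstract: fast renal phase ~2 h, slow phase ~6 days.
T_HALF_FAST_H = 2.0            # hours
T_HALF_SLOW_H = 6.0 * 24.0     # 6 days, in hours
FRACTION_FAST = 0.95           # ASSUMED fraction eliminated via the fast phase

def remaining_fraction(t_hours: float) -> float:
    """Fraction of the administered dose still in the body at time t,
    modeled as a sum of two exponential decays."""
    k_fast = math.log(2) / T_HALF_FAST_H
    k_slow = math.log(2) / T_HALF_SLOW_H
    return (FRACTION_FAST * math.exp(-k_fast * t_hours)
            + (1 - FRACTION_FAST) * math.exp(-k_slow * t_hours))

for label, t in [("1 h", 1.0), ("24 h", 24.0), ("7 days", 7.0 * 24.0)]:
    print(f"{label:>7}: {remaining_fraction(t) * 100:5.1f}% of dose remaining")
```

The model makes the abstract's point tangible: after a day the fast phase has essentially run its course, and the small slowly released tissue fraction dominates what remains.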

Conclusion

Before performing magnetic resonance imaging (MRI) with gadolinium-based contrast agents, an individual risk-benefit assessment should be made (expected benefit of the imaging, potentially as yet unrecognized risks, availability of alternatives and their risks). Outside of studies, measuring gadolinium in patients' urine or blood is not meaningful.


