PSU Volume 66 No 01 JANUARY 2026

Transition of Care of Pediatric Surgical Patients

The transition from pediatric to adult healthcare represents a critical and complex process, especially for surgical patients who have lived with congenital anomalies, chronic diseases, or post-operative conditions that require lifelong management. As advances in pediatric surgery, neonatal resuscitation, and perioperative care have significantly improved survival outcomes, many children with once-lethal conditions now reach adulthood. However, this increased longevity has created a new challenge: ensuring these patients continue to receive high-quality, condition-specific care within adult healthcare systems that may not be equipped to handle the nuances of pediatric-onset disorders.


Transition of care is not a singular moment but a multifaceted, developmental process that unfolds over time. It involves preparing adolescents and their families for the eventual transfer of clinical responsibility to adult healthcare teams. This shift must address not only the medical needs of the patient but also psychosocial, educational, and vocational aspects. Transition planning should promote autonomy, empower self-advocacy, and ensure continuity of care. Despite recognition of its importance, many pediatric surgical specialties still lack formal, structured transition programs, leading to gaps in care, delayed referrals, and suboptimal long-term outcomes.


One of the most pressing issues is the scarcity of adult providers trained or willing to manage patients with pediatric surgical histories. Many adult surgeons are unfamiliar with the long-term sequelae of congenital conditions or childhood interventions. For instance, patients who underwent correction for anorectal malformations or Hirschsprung disease in infancy may face lifelong bowel dysfunction, incontinence, or complications related to sexual and urological function. Similarly, those treated for esophageal atresia, spina bifida, or congenital cardiac disease often require surveillance for late-onset complications that fall outside the typical scope of adult care. Without appropriate transition, these individuals may disengage from follow-up until they present emergently—by which time their conditions may have progressed, sometimes irreversibly.


Pediatric surgical teams often continue care well into the patient's twenties, in part because of the lack of clear pathways or designated receiving adult services. While this continuity can offer reassurance, it creates ethical and logistical tensions. Pediatric providers must weigh their commitment to individual patients against the broader duty to care for new pediatric cases. The balance between beneficence—ensuring the best outcomes for the individual—and justice—fair distribution of finite resources—is delicate. The absence of transition policies also increases the risk of violating patient autonomy, particularly when adolescents are not included in decisions about their future care or are unprepared to assume responsibility for managing their health.


Surveys of surgical and medical professionals confirm widespread recognition of these challenges. A substantial majority acknowledge the benefits of structured transition models and advocate for standardized protocols. Yet implementation remains patchy, hindered by a lack of resources, training, and institutional support. Many health systems do not mandate transition policies, and even where guidelines exist, they are often not tailored to the surgical population. The result is a fragmented, inconsistent experience for patients and families.


Disease-specific approaches have emerged in response to this gap. In chronic intestinal failure, where patients rely on home parenteral nutrition, international experts have developed consensus protocols grounded in clinical experience and theory. These protocols emphasize early initiation of transition planning—ideally 1–2 years before the anticipated transfer—and include joint clinics staffed by pediatric and adult teams, transitional care coordinators, and standardized readiness assessments. Tailored interventions, such as education on catheter care and emergency management, address the unique risks these patients face. Central to the protocol's success is a nurse specialist embedded in both the pediatric and adult services, serving as a bridge and consistent presence throughout the transition.


Similarly, research on adolescents with inflammatory bowel disease underscores the importance of maturity-based rather than age-based transitions. Clinical data reveal that early-onset Crohn's disease is often more aggressive than its adult-onset counterpart, associated with higher surgical risk and greater likelihood of biologic therapy. This phenotype, combined with the psychosocial burden of a chronic illness during adolescence, demands a nuanced, individualized transition plan. A failure to manage the transition effectively can lead to disease flares, reduced medication adherence, and poor health-related quality of life.


In colorectal surgery, conditions such as anorectal malformations and Hirschsprung disease present particular transition challenges. Patients often develop a dependency on their pediatric teams and express reluctance to engage with unfamiliar adult services. Many lack understanding of their own medical history or the implications of their condition in adulthood. Concurrently, adult providers frequently report insufficient knowledge of pediatric colorectal pathologies and express discomfort managing their sequelae. In the absence of structured joint clinics or shared protocols, these patients face a heightened risk of discontinuity of care.


A systematic review of transitional care in colorectal surgery identified three overarching categories of barriers: patient-related (e.g., limited knowledge, psychological readiness), provider-related (e.g., inadequate training, lack of communication), and system-level (e.g., absence of joint clinics, lack of standardized pathways). Solutions to these challenges include fostering autonomy and self-efficacy among patients through education, training adult clinicians in pediatric-onset conditions, and establishing multidisciplinary transition clinics. The creation of condition-specific guidelines is also critical, as generic transition frameworks often fail to capture the complexities of surgical diseases.


From an ethical standpoint, transition of care must be grounded in respect for autonomy, beneficence, non-maleficence, and justice. Transition planning should begin early, with adolescents actively participating in decisions about their future care. Patients must be given the tools to develop health literacy, navigate insurance changes, and manage treatment regimens independently. Simultaneously, institutions have a responsibility to ensure that adult providers are prepared and resourced to care for this unique population. Without such infrastructure, patients may face gaps in care or even harm.


One underappreciated dimension of transition is the role of genetics and personalized medicine. In inflammatory bowel disease, for example, studies have demonstrated significant associations between specific genetic polymorphisms and disease severity, location, and likelihood of surgery. Knowledge of a patient's genetic risk profile could inform timing and content of transition planning, particularly for those at greater risk of complications. It may also guide selection of adult providers with expertise in managing high-risk cases.


Despite growing awareness and international efforts to define best practices, many health systems remain unprepared to implement comprehensive transition models. Barriers include limited personnel, lack of training, and inadequate reimbursement for transition-related services. Moreover, the absence of a formal transition policy at the national or institutional level means that many clinicians continue to rely on informal, ad hoc arrangements. This variability exacerbates disparities in care, particularly for patients from underrepresented backgrounds or those with complex social needs.


Effective transition of care for pediatric surgical patients requires more than institutional commitment; it demands cultural change across pediatric and adult disciplines. Adult providers must be equipped not only with clinical knowledge but also with an appreciation of the developmental and emotional needs of patients who are emerging from a highly supportive pediatric environment. Conversely, pediatric teams must be willing to relinquish care at the appropriate time and to support patients and families through the uncertainty of change.


At its core, successful transition is about preserving the continuity, quality, and humanity of care. The transition process must respect the history of each patient's journey while preparing them for the road ahead. It requires coordination, communication, and above all, a shared commitment to supporting young adults as they move forward with their lives.


References:
1- Grossklaus H, Barnett S. Reflection on young adult transitional care in the Boston Children's Hospital Perioperative Care Coordination Clinic. J Pediatr Nurs. 62:184-187, 2022
2- Plascevic J, Shah S, Tan YW. Transitional Care in Anorectal Malformation and Hirschsprung's Disease: A Systematic Review of Challenges and Solutions. J Pediatr Surg. 59(6):1019-1027, 2024
3- Moore EJ, Sawyer SM, King SK, Tien MY, Trajanovska M. Transition From Pediatric to Adult Healthcare for Colorectal Conditions: A Systematic Review. J Pediatr Surg. 59(6):1028-1036, 2024
4- Demirok A, Benninga MA, Diamanti A, El Khatib M, Guz-Mark A, Hilberath J, Lambe C, Norsa L, Pironi L, Sanchez AA, Serlie M, Tabbers MM. Transition from pediatric to adult care in patients with chronic intestinal failure on home parenteral nutrition: How to do it right? Clin Nutr. 43(8):1844-1851, 2024
5- Sia WT, Tay JC, Lee TC, Nah SA, A Nallusamy MA, Mahendran HA. Current practice and barriers for transition of care (TOC) in pediatric surgery: perspectives of adult surgeons from different subspecialties. Pediatr Surg Int. 41(1):76, 2025
6- Carlisle EM, Sundland R, Shakhsheer B, Arnold M, Lee J, Mills J, Martin K, Mueller C, Gow K. Ethics of Transition of Care of Pediatric Surgical Patients to Adult Providers. J Pediatr Surg. 60(4):162228, 2025
7- Mocci G, Orrù G, Onidi FM, Corpino M, Marongiu A, Argiolas GM, Runfola M, Manunza R, Locci G, Tamponi E, Zolfino T, Usai Satta P, Muscas A, Rossino R, Savasta S, Congia M. Clinical and Genetic Characteristics of Pediatric Patients with Inflammatory Bowel Disease Transitioning to Adult Medicine: A Single-Center Ten-Year Experience. J Clin Med. 14(11):3741, 2025

Syndromic Biliary Atresia

Syndromic biliary atresia represents a distinct minority of biliary atresia cases but poses challenges that differ from isolated disease. About ten percent of infants with biliary atresia present with splenic and laterality anomalies that define the biliary atresia splenic malformation (BASM) spectrum. These patients frequently show polysplenia, double spleens, or rarely asplenia, in combination with features such as situs inversus, preduodenal portal vein, absent inferior vena cava, intestinal malrotation, and a broad range of cardiac anomalies. The presence of these abnormalities signals a developmental disturbance early in embryogenesis. Imaging and operative findings often show atypical vascular arrangements, single or preduodenal portal veins, and altered biliary anatomy. These anomalies complicate exposure and reconstruction but do not preclude success when handled with sound technique.


The syndrome begins with early onset of biliary obstruction. Several studies note that infants with splenic malformations tend to declare their disease sooner than non-syndromic infants, which drives them to surgery at a younger age. Pooled data confirm this pattern, showing that the age at Kasai operation is consistently lower by about ten to thirteen days among those in the syndromic group. The earlier disease onset appears to reflect an intrinsic pace of biliary injury rather than differences in referral patterns. In some cases, early stools may still contain bile, misleading clinicians and briefly delaying diagnosis, but the underlying process remains aggressive.


Early surgery is a clear advantage and remains the most important modifiable factor in management. Across the collected series, successful Kasai reconstruction within the first sixty days of life offers the best chance of long-term native liver survival. Even in syndromic disease, timely surgery supports meaningful long-term outcomes. Operative illustrations in the reports show absence of gallbladder or biliary tree, a fibrotic portal plate, and anomalous portal venous structures. These views remind surgeons that dissection planes may differ, and exposure must be deliberate. The dissection should aim to maximize the surface of exposed microscopic ductules while remaining alert for aberrant portal vein courses. A single portal vein or preduodenal configuration demands precise management to avoid injury, particularly during the portoenterostomy.


Historically, syndromic biliary atresia carried a reputation for poor outcomes. Several earlier reports suggested higher rates of postoperative complications, including cholangitis, persistent jaundice, and portal hypertension with variceal bleeding. Some believed that prognosis was impaired because the anomalies made the Kasai operation more complex, particularly in the presence of situs inversus or severe vascular rearrangement. Others emphasized that associated cardiac defects may influence outcomes independently. Yet the cumulative evidence from the seven reviewed studies presents a more balanced picture.


When outcomes across more than two thousand patients were pooled, no significant difference was found between syndromic and non-syndromic groups for jaundice clearance or native liver survival. Clearance of jaundice showed overlapping performance between groups, with odds ratios near unity and low heterogeneity. Native liver survival at follow-up periods ranging from one to twenty years also showed no clear divergence between cohorts. Differences noted in some earlier work appear to be a function of small cohorts, variable definitions, and inconsistent follow-up durations. Once follow-up time is standardized across subgroups, the survival curves between groups lose their earlier separation.


The operative challenges remain real, yet they do not reliably predict failure. Increasing surgical experience and improved imaging to define vascular anomalies help to equalize outcomes. Preoperative planning with high-quality ultrasound, magnetic resonance imaging, and attention to the portal venous trajectory is essential. Surgeons should anticipate aberrant structures and approach the hilum with caution. Despite these challenges, the technical goals remain the same as for isolated disease: complete clearance of fibrotic tissue overlying the portal plate and an unobstructed, tension-free Roux limb.


The postoperative course follows a familiar pattern. Infants face risk of cholangitis, persistent jaundice, hepatopulmonary syndrome, and progressive portal hypertension. Cholangitis stands out as a key predictor of poor outcomes when recurrent or severe. Some centers describe improvement with early postoperative steroid therapy, noting better bilirubin profiles and increased rates of jaundice clearance within the first six months. Steroids are usually tapered within a few months. While their long-term influence remains debated, they may optimize early recovery in selected cases.


New data from long-term cohorts bring a surprising insight into the biology of syndromic disease. Studies analyzing biomarkers of liver fibrosis over more than three decades found that syndromic groups, both BASM and non-BASM variants, showed lower markers of fibrosis compared with isolated biliary atresia. Platelet counts were higher at the time of Kasai and at three-year follow-up, suggesting less splenic sequestration and milder portal hypertension. The AST-to-platelet ratio index (APRi) was lower in syndromic infants compared with isolated disease, and the varices prediction rule (VPR) consistently showed a lower likelihood of significant varices. Endoscopic findings confirm this biologic signal. Clinically significant varices were less common in syndromic biliary atresia, with rates near four percent compared with more than twenty percent in isolated disease. This pattern indicates that, despite early onset of biliary obstruction, the long-term fibrotic trajectory may be milder in some forms of syndromic biliary atresia.
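
For readers unfamiliar with the index, the standard APRi calculation can be sketched as follows. This is a minimal illustration of the widely used formula, not code from the cited studies; the laboratory values and the AST upper limit of normal (ULN) shown are hypothetical and vary by laboratory.

```python
# AST-to-platelet ratio index (APRi), a noninvasive surrogate of liver fibrosis.
# Formula: APRi = (AST / ULN of AST) / platelet count (10^9/L) x 100
def apri(ast_iu_l, ast_uln_iu_l, platelets_10e9_l):
    """Return the APRi given AST (IU/L), the lab's AST ULN (IU/L),
    and the platelet count (10^9/L)."""
    return (ast_iu_l / ast_uln_iu_l) / platelets_10e9_l * 100

# Hypothetical example: AST 80 IU/L, ULN 40 IU/L, platelets 100 x 10^9/L
print(apri(80, 40, 100))  # → 2.0
```

Lower platelet counts (splenic sequestration) and higher AST both raise the index, which is why milder portal hypertension in syndromic disease translates into lower APRi values.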


The mechanism behind this is not fully understood. Genetic studies point toward variants in laterality and ciliopathy pathways, including genes such as CFC1 and PKD1L1. Other associated pathways influence ciliary morphogenesis, bile duct development, and laterality specification. Abnormal embryologic signaling may produce the extrahepatic anomalies that define the syndrome and simultaneously influence liver fibrosis patterns. Some studies suggest environmental contributions, including maternal diabetes in selected cases. The heterogeneity of associated conditions, such as Cat-Eye syndrome, Kabuki syndrome, Hardikar syndrome, and a wide range of nonrandom congenital associations, supports the view that syndromic biliary atresia is not a single disease but a convergence of several developmental disorders that share a final pathway of biliary obstruction.


Understanding these mechanisms may eventually guide personalized strategies. For surgeons, though, the immediate implication is that syndromic anatomy does not always predict poor long-term liver health. A child with polysplenia or malrotation may still experience milder portal hypertension years after surgery compared with a child with isolated disease. The risk of requiring transplantation remains present, but some cohorts show durable native liver survival in both BASM and non-BASM syndromic groups comparable to isolated disease.


These findings do not imply complacency. Careful follow-up remains necessary because syndromic infants still display wide variability in postoperative stability. Even when jaundice clears early, subtle progression of fibrosis or late complications such as hepatopulmonary syndrome can occur. Surveillance strategies should include routine liver biochemistry, growth assessment, ultrasound with Doppler evaluation, endoscopy when clinically indicated, and consideration of biomarkers such as APRi and VPR to guide timing of intervention. Early management of complications can prevent avoidable morbidity.


While the literature shows no significant difference in survival, the limited number of controlled studies and small cohort sizes restrict firm conclusions. Many analyses rely on retrospective data, and definitions of jaundice clearance vary across publications. Time-to-event strategies like Kaplan-Meier or Cox regression are ideal but often under-reported. Future studies with larger, prospective designs may establish clearer prognostic signals and help distinguish which anatomic or genetic subgroups behave differently.
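
The Kaplan-Meier product-limit estimator mentioned above can be sketched in a few lines; the follow-up times and event flags below are hypothetical and for illustration only, not data from the reviewed cohorts.

```python
# Kaplan-Meier product-limit estimator (minimal sketch).
# times: follow-up in years; events: 1 = endpoint reached (e.g. transplant
# or death), 0 = censored (e.g. alive with native liver at last follow-up).
def kaplan_meier(times, events):
    """Return a list of (time, survival probability) at each event time."""
    data = sorted(zip(times, events))
    n_at_risk = len(data)
    survival = 1.0
    curve = []
    i = 0
    while i < len(data):
        t = data[i][0]
        at_t = [e for tt, e in data if tt == t]  # all subjects observed at t
        deaths = sum(at_t)
        if deaths:
            survival *= 1 - deaths / n_at_risk
            curve.append((t, survival))
        n_at_risk -= len(at_t)  # events and censored leave the risk set
        i += len(at_t)
    return curve

# Hypothetical cohort of five infants
print(kaplan_meier([1, 2, 2, 3, 4], [1, 0, 1, 1, 0]))
```

Because censored subjects contribute follow-up time without being counted as failures, this approach avoids the bias of simple proportions when follow-up durations differ, which is exactly the standardization issue raised above.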


As current data stand, the key principles for surgeons treating syndromic biliary atresia remain consistent. Early diagnosis is essential. Early Kasai reconstruction within the first two months improves the chance of long-term survival, regardless of anatomic anomalies. Detailed preoperative imaging and deliberate intraoperative technique can overcome the hurdles posed by atypical vascular or biliary anatomy. Postoperative care mirrors that of isolated biliary atresia, though syndromic patients may show different patterns of fibrosis and portal hypertension. Steroid-enhanced recovery in the early postoperative period may play a role for selected patients. Long-term follow-up should be individualized and attentive to baseline anomalies.


Syndromic biliary atresia is an important subgroup that challenges the surgeon in diagnosis, operative planning, and long-term care. Yet the collective evidence shows that, when treated early and skillfully, these infants can achieve outcomes similar to those with isolated disease. Variability persists, but syndromic anatomy alone should not dictate prognosis or limit the pursuit of full biliary reconstruction. With continued advances in imaging, genetics, and postoperative management, surgeons can approach syndromic biliary atresia with greater clarity, grounded in the knowledge that careful technique and early intervention remain the strongest determinants of success.


References:
1- Xu X, Dou R, Zhao S, Zhao J, Gou Q, Wang L, Zhan J. Outcomes of biliary atresia splenic malformation (BASM) syndrome following Kasai operation: a systematic review and meta-analysis. World J Pediatr Surg. 5(3):e000346, 2022
2- He L, Chung PHY, Lui VCH, Tang CSM, Tam PKH. Current Understanding in the Clinical Characteristics and Molecular Mechanisms in Different Subtypes of Biliary Atresia. Int J Mol Sci. 23(9):4841, 2022
3- So K, Shinagawa T, Yoshizato T, Fukahori S, Asagiri K, Maeno Y, Hayashida S, Ushijima K. Difficulty in the Diagnosis of Biliary Atresia Splenic Malformation Syndrome In Utero. Kurume Med J. 68(3–4):265–268, 2023
4- Alhashmi H, Chawshly E, Çelebi S. Biliary atresia-splenic malformation (BASM) syndrome: A case report. Int J Surg Case Rep. 121:109937, 2024
5- Davenport M. Updates in Biliary Atresia: Aetiology, Diagnosis and Surgery. Children (Basel). 12(1):95, 2025
6- Davenport M. Syndromic variants of biliary atresia. World J Pediatr Surg. 8(3):e001040, 2025
7- Schwarz D, Lam C, Davenport M. Long-term reduction of liver fibrosis surrogates in syndromic biliary atresia. J Pediatr Surg. 60(11):162626, 2025

Pyomyositis

Pyomyositis is a primary bacterial infection of skeletal muscle that leads to localized inflammation and frequently abscess formation. Once considered a disease largely confined to tropical climates, its growing recognition in temperate regions has reshaped epidemiologic assumptions and raised new clinical questions. Across recent observational studies, systematic reviews, and case series, pyomyositis emerges as a complex infection influenced by microbial, host, and environmental factors, and one that continues to challenge clinicians because of its variable presentation and potential for severe complications.


The historical characterization of pyomyositis as "tropical myositis" reflected early epidemiologic patterns, with reports from sub-Saharan Africa and other equatorial areas describing high burdens and notable contributions to surgical caseloads. A systematic review of global studies underscores this origin while confirming increasing reports from temperate regions, where previously the disease had been rare. This geographic shift is echoed in pediatric studies from Europe and Japan, which document a rising number of cases over the past decades. The consistency of these reports argues that pyomyositis is no longer a tropical infection alone, but a globally relevant condition influenced by both environmental exposure and host susceptibility.


From a demographic perspective, pyomyositis affects children and adults of all ages, but many cohorts report a predominance of cases in the pediatric population. Several studies note a skew toward males, particularly in younger age groups, with one meta-analysis identifying males under 20 years as the most frequently affected demographic. In a decade-long pediatric study focusing on pelvic pyomyositis, two age peaks were identified: children under two years and adolescents, suggesting different exposure profiles or physiological vulnerabilities across developmental stages.


Trauma has long been recognized as a contributing factor, and multiple cohorts reinforce this association. In one pediatric pelvis-focused study, nearly a quarter of children reported recent trauma, and some had associated skin lesions or recent intramuscular injections. In a large extremity-focused series, trauma preceded symptom onset in nearly 70 percent of cases, highlighting the possibility that muscle microinjury enables bacterial seeding or growth.


Immunosuppression also plays a critical role. A meta-analysis evaluating factors associated with pyomyositis found strong correlations with HIV infection and advanced immunosuppression, with odds ratios as high as six for those with AIDS-defining illness. Other reported comorbidities include diabetes, hematologic malignancies, renal disease, and autoimmune conditions. Even among otherwise healthy children, concurrent viral or respiratory infections are not uncommon, raising the possibility that transient immune modulation may increase vulnerability to bacterial invasion.
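
An odds ratio of this kind is derived from a 2x2 exposure/outcome table. The sketch below uses entirely hypothetical counts chosen only to illustrate the arithmetic; they are not data from the cited meta-analysis.

```python
# Odds ratio from a 2x2 table: OR = (a/b) / (c/d) = (a*d) / (b*c)
# a = exposed with outcome, b = exposed without,
# c = unexposed with outcome, d = unexposed without.
def odds_ratio(a, b, c, d):
    """Return the odds ratio for a 2x2 exposure/outcome table."""
    return (a * d) / (b * c)

# Hypothetical: 30 of 100 immunosuppressed patients develop pyomyositis
# (70 do not), versus 10 of 150 immunocompetent patients (140 do not).
print(odds_ratio(30, 70, 10, 140))  # → 6.0
```

An odds ratio of six means the odds of pyomyositis among the exposed group are six times the odds among the unexposed group, which is the magnitude reported for advanced immunosuppression.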


Across studies, Staphylococcus aureus remains the dominant pathogen. Numerous pediatric cohorts describe S. aureus isolation in 30 to 90 percent of culture-positive cases, with both methicillin-sensitive and methicillin-resistant strains represented. Several systematic reviews similarly identify S. aureus as the principal organism, accounting for nearly four out of five cases in some analyses. In one pediatric case series from Somalia, cultures grew S. aureus in 69 percent of patients, reinforcing the organism's central role across continents and healthcare environments.


Despite the predominance of S. aureus, less common pathogens are increasingly recognized. A detailed pediatric case report described pyomyositis caused by Streptococcus pneumoniae, a rare but important cause of invasive muscle infection. The report highlighted severe systemic inflammation, rapid progression, and abscess formation involving multiple pelvic muscles. Such cases emphasize that while empirical therapy should prioritize staphylococcal coverage, clinicians must remain alert to atypical causes, particularly in severe or refractory disease.


The presentation of pyomyositis is notoriously variable. Early symptoms are non-specific and may include low-grade fever, localized muscle pain, and subtle functional limitations. These features often mimic more common conditions such as muscle strain or transient synovitis, contributing to diagnostic delays. In the pediatric pelvis-focused study, pain, functional limitation, and fever were the most frequent presenting symptoms, yet diagnostic delays averaged five days from symptom onset and were even longer in younger children, who tend to present with irritability rather than localized pain.


Across multiple reports, the muscles of the pelvis and thigh are the most frequently involved sites, including the iliopsoas, obturator muscles, and gluteal groups. A Japanese 32-year institutional review demonstrated that more than 90 percent of pyomyositis cases involved muscles adjacent to the hip joint, reinforcing this anatomical pattern in temperate climates as well. Studies focusing on extremity involvement note similar trends, with lower extremity muscles affected more often than upper extremity groups.


Laboratory markers such as C-reactive protein, erythrocyte sedimentation rate, and white blood cell count are usually elevated but lack diagnostic specificity. In pediatric patients, C-reactive protein tends to be significantly higher in older children, while very young infants may show subtler laboratory abnormalities despite significant infection.


Imaging is the cornerstone of diagnosis. Magnetic resonance imaging (MRI) consistently emerges as the most reliable modality, capable of detecting intramuscular edema, abscesses, and involvement of adjacent structures with high sensitivity. In the pelvic pyomyositis study, MRI detected abnormalities in every case, even when clinical and laboratory findings were inconclusive. The addition of diffusion-weighted imaging further enhanced detection of deep or early infection.


Systematic reviews confirm MRI as the diagnostic gold standard, though ultrasound remains a useful initial tool due to its accessibility, particularly for identifying fluid collections suitable for aspiration. Plain radiographs are generally unremarkable early in disease but can help rule out alternative diagnoses. In acute musculoskeletal infection more broadly, combined clinical, laboratory, and imaging findings are considered essential because no single test offers definitive sensitivity or specificity.


Pyomyositis has been traditionally divided into three stages: invasive, suppurative, and late or septic. Early disease can remain subtle for days before progressing to abscess formation. In the extremity-focused case series, most children presented after an average of two weeks of symptoms, reflecting the difficulty of recognizing early-stage disease and the rapid progression to abscess formation in many cases.


Antibiotic therapy and drainage remain the primary treatments. Empiric therapy typically includes coverage for methicillin-sensitive and methicillin-resistant S. aureus, with broad-spectrum regimens initiated when necessary. In a reported case of S. pneumoniae pyomyositis, initial broad coverage was appropriately narrowed once the organism was identified, but the child still required six weeks of combined intravenous and oral therapy, highlighting the prolonged courses often required for complete resolution.


Surgical drainage is often necessary, particularly in the suppurative stage. A systematic review found that medical therapy alone was successful in approximately 40 percent of cases but that surgical or percutaneous drainage was required in the remainder, especially when abscess size was significant or when clinical deterioration occurred. Higher inflammatory marker levels and more severe symptoms at presentation predicted the need for operative intervention.

In the large Somali pediatric series, all children underwent surgical debridement, with nearly universal recovery. Only a small subset developed complications such as osteomyelitis, demonstrating that with timely intervention, outcomes can be favorable even in resource-limited settings.


Complications across studies include osteomyelitis, septic arthritis, sepsis, and multifocal infection. Methicillin-resistant S. aureus has been identified as a predictor of more severe disease and higher complication rates. Pediatric reports also highlight the risk of multifocal infection, particularly in pelvic disease where adjacent joints and bones are frequently involved.


The emerging epidemiology of pyomyositis reflects broader changes in global health patterns. The increase in temperate-region cases reported in recent pediatric and adult cohorts suggests improved recognition, greater use of advanced imaging, or possibly true shifts in disease distribution. The Japanese 32-year institutional review documented a notable rise in cases in the past sixteen years compared to the previous sixteen, although the difference did not reach statistical significance. Nevertheless, this trend reinforces the need for heightened clinical suspicion, even in areas where pyomyositis was once considered rare.


Pyomyositis is a globally relevant infection marked by diverse presentations, evolving epidemiology, and the continued dominance of Staphylococcus aureus as the primary pathogen. Though early symptoms can be subtle, timely recognition is essential to prevent complications. MRI stands as the most sensitive diagnostic tool, and management typically requires a combination of targeted antimicrobial therapy and drainage of abscesses when present. Recent studies from Europe, Asia, and Africa highlight the infection's rising incidence in temperate regions and underline the importance of including pyomyositis in the differential diagnosis of musculoskeletal infections, particularly in children presenting with fever, pain, and functional limitation. Continued research is needed to clarify pathogenesis, refine diagnostic pathways, and optimize treatment strategies for this challenging and often under-recognized disease.


References:
1- Ngor C, Hall L, Dean JA, Gilks CF. Factors associated with pyomyositis: A systematic review and meta-analysis. Trop Med Int Health. 26(10):1210–1219, 2021
2- Vij N, Ranade AS, Kang P, Belthur MV. Primary Bacterial Pyomyositis in Children: A Systematic Review. J Pediatr Orthop. 41(9):e849–e854, 2021
3- Abbati G, Abu Rumeileh S, Perrone A, Galli L, Resti M, Trapani S. Pelvic Pyomyositis in Childhood: Clinical and Radiological Findings in a Tertiary Pediatric Center. Children (Basel). 9(5):685, 2022
4- Barchi L, Fastiggi M, Bassoli I, Bonvicini F, Silvotti M, Iughetti L, De Fanti A. Pyomyositis associated with abscess formation caused by Streptococcus pneumoniae in children: a case report and review of literature. Ital J Pediatr. 49(1):73, 2023
5- Sykes MC, Ahluwalia AK, Hay D, Dalrymple J, Firth GB. Acute musculoskeletal infection in children: assessment and management. Br J Hosp Med (Lond). 84(6):1–6, 2023
6- Higuchi C, Otsuki D, Kobayashi M, Yamanaka A, Tamura D, Okada S, Kawabata H. Characteristics of Pyomyositis at a Pediatric Hospital in Osaka, Japan. Cureus. 17(6):e86325, 2025
7- Zeybek H, Cici H, Kiratli K, Abdulkarim OM, Yildiz G, Yildirim C. Primary Purulent Infectious Myositis of the Extremities in Children. J Pediatr Orthop. 2025


PSU Volume 66 No 02 FEBRUARY 2026

Pyloric Atresia and Epidermolysis Bullosa

Pyloric atresia associated with epidermolysis bullosa represents one of the most severe congenital syndromes encountered in neonatal medicine, combining a mechanical obstruction of the gastric outlet with a profound disorder of skin and mucosal integrity. Although pyloric atresia alone accounts for only a small fraction of intestinal atresias, its association with epidermolysis bullosa markedly alters the clinical course, prognosis, and management priorities. This combined condition is rare, typically presenting in the neonatal period, and is characterized by early gastrointestinal obstruction, extensive skin fragility, and a high risk of multisystem complications that frequently culminate in early mortality.


Clinically, affected neonates usually present within the first days of life with non-bilious vomiting, feeding intolerance, and progressive abdominal distension caused by complete obstruction at the level of the pylorus. Radiographic imaging classically demonstrates a markedly distended stomach with absence of distal bowel gas, often referred to as a "single bubble" sign. These findings are often preceded by antenatal clues, particularly polyhydramnios and fetal gastric dilation detected on prenatal ultrasonography, reflecting impaired gastric emptying in utero. At the same time, cutaneous manifestations may be evident at birth or emerge shortly thereafter, including tense bullae, erosions, or areas of congenital skin absence. Even minimal mechanical trauma, such as handling or adhesive application, can provoke new blister formation, underscoring the extreme fragility of the integument in this disorder.


Epidermolysis bullosa with pyloric atresia is now recognized as a genetically determined condition most commonly inherited in an autosomal recessive pattern. The underlying defect involves proteins essential for dermo-epidermal adhesion, particularly those associated with hemidesmosomes and the basement membrane zone. Pathogenic variants in genes encoding integrin α6, integrin β4, and plectin disrupt epithelial stability not only in the skin but also in the gastrointestinal tract, urinary system, and respiratory mucosa. This explains why the disease extends beyond the skin to involve pyloric development, renal structures, and internal epithelial linings. The phenotype varies in severity depending on the nature of the mutation, but many affected infants experience extensive disease with rapid clinical deterioration.


From a pathological standpoint, pyloric atresia in this syndrome may take several anatomical forms, ranging from a thin membranous web to a solid fibrous cord or a complete gap between the stomach and duodenum. These anatomical variations have important implications for surgical management. Less extensive lesions may permit pyloroplasty or excision of a pyloric membrane, whereas more complex forms require bypass procedures such as gastroduodenostomy or gastrojejunal anastomosis. In practice, the choice of operation is often influenced not only by anatomy but also by the infant's overall condition, body size, tissue fragility, and the feasibility of safely mobilizing surrounding structures.


Surgical correction of the pyloric obstruction is essential for survival, yet it does not alter the underlying disease process. Even when surgery is technically successful and early postoperative feeding is achieved, the long-term outcome remains guarded. The postoperative period is frequently complicated by wound breakdown, infection, electrolyte disturbances, and feeding difficulties. Skin trauma during anesthesia, intubation, vascular access, and surgical positioning can lead to widespread blistering and erosions. As a result, meticulous perioperative planning is required, including avoidance of adhesive tapes, careful fixation of tubes, padding of pressure points, and gentle tissue handling. Central venous access is often necessary for nutritional and fluid management, but catheter placement itself carries significant risks in the context of fragile skin and impaired wound healing.


Beyond the gastrointestinal tract and skin, multisystem involvement is common and contributes substantially to morbidity and mortality. Renal and urinary tract anomalies, such as hydronephrosis, dysplastic kidneys, and obstructive uropathy, have been reported with notable frequency. Protein-losing enteropathy may develop due to mucosal fragility within the intestine, leading to chronic diarrhea, hypoalbuminemia, and failure to thrive. Respiratory complications are also prominent, including mucosal blistering of the airway, recurrent aspiration, and severe infections. These complications often interact, producing a cascade of clinical deterioration that is difficult to reverse despite intensive supportive care.


Infectious complications remain a leading cause of death in affected infants. Open skin lesions provide a portal of entry for bacteria, while immune compromise related to malnutrition and chronic inflammation further increases susceptibility. Sepsis may develop rapidly and prove refractory to broad-spectrum antimicrobial therapy. Recurrent pneumonia, whether infectious or aspiration-related, is another frequent terminal event. Even in cases where initial surgical and dermatologic management appears successful, late-onset infections can abruptly worsen the clinical course and lead to fatal outcomes weeks or months after birth.


Diagnostic confirmation relies on a combination of clinical features, imaging, and laboratory evaluation. While the diagnosis of pyloric atresia is usually established radiographically, confirmation of epidermolysis bullosa may involve skin biopsy with ultrastructural or immunofluorescence analysis, as well as molecular genetic testing. In practice, definitive genetic results are often obtained after clinical decisions have already been made, particularly in rapidly progressive cases. Nevertheless, establishing the genetic basis is important for prognostication, family counseling, and future reproductive planning. Prenatal diagnosis may be possible in families with known mutations, allowing informed decision-making and anticipatory perinatal care.


The overall prognosis of epidermolysis bullosa with pyloric atresia remains poor despite advances in neonatal intensive care and surgical techniques. Mortality is highest in the neonatal period, especially among infants with extensive skin involvement, severe mutations, and associated systemic anomalies. A minority of patients survive beyond infancy, and those who do often face chronic medical challenges, including persistent skin disease, nutritional deficiencies, and recurrent infections. Importantly, survival does not necessarily correlate with the success of pyloric surgery alone, emphasizing that the gastrointestinal obstruction is only one component of a broader systemic disorder.


Management therefore requires a coordinated, multidisciplinary approach that balances aggressive supportive care with realistic assessment of prognosis. Surgical correction of pyloric atresia should be accompanied by meticulous dermatologic care, nutritional support, infection surveillance, and careful handling at every stage of treatment. In some cases, early involvement of palliative care services may be appropriate to support families and guide decision-making, particularly when the burden of disease is overwhelming and the likelihood of long-term survival is low. Transparent communication with caregivers about the nature of the condition, expected complications, and potential outcomes is essential throughout the clinical course.


In summary, pyloric atresia associated with epidermolysis bullosa is a devastating congenital syndrome rooted in fundamental defects of epithelial integrity. Its presentation is marked by early gastric outlet obstruction and severe skin fragility, with frequent involvement of multiple organ systems. Although surgical intervention is necessary to relieve pyloric obstruction, it does not address the underlying genetic disease, and survival remains limited by infectious, nutritional, and respiratory complications. Continued recognition of this condition, careful multidisciplinary management, and advances in genetic diagnosis are essential to improving care and supporting affected families, even as the prognosis remains guarded in most cases.


References:
1- Lucky AW, Gorell E. Epidermolysis bullosa with pyloric atresia. In: GeneReviews® [Internet]. Seattle (WA): University of Washington, Seattle; 1993–2025. First published February 22, 2008; updated January 26, 2023.
2- Márquez K, Rodríguez DA, Pérez LA, Duarte M, Zárate LA. Epidermolysis bullosa with pyloric atresia: Report of two cases in consecutive siblings. Biomédica. 41(2):201–207, 2021
3- Pan P. Congenital pyloric atresia and epidermolysis bullosa: Report of a rare association. Journal of Indian Association of Pediatric Surgeons. 26(4):256–258, 2021
4- Luo C, Yang L, Huang Z, Su Y, Lu Y, Yu D, Zhang M, Wu K. Case report: Epidermolysis bullosa complicated with pyloric atresia and a literature review. Frontiers in Pediatrics. 11:1098273, 2023
5- Saleem A, Khan AM, Ahmed M. Pyloric atresia associated with epidermolysis bullosa: A case report. Journal of Ayub Medical College Abbottabad. 36(4):838–840, 2024
6- Sakamoto N, Masumoto K, Aoyama T, Shirane K, Homma Y. Pyloric atresia in a neonate with epidermolysis bullosa: A case report. Clinical Case Reports. 12(12):e9685, 2024

Tailgut Cysts

Tailgut cysts are rare congenital lesions that arise from remnants of the embryonic hindgut that fail to regress during early development. During normal embryogenesis, the tailgut appears transiently as the most distal portion of the primitive gut and typically involutes by the sixth week of gestation. When this involution is incomplete, epithelial remnants persist and may later give rise to cystic lesions in the presacral or retrorectal space. These cysts are also referred to as retrorectal cystic hamartomas and represent a small but clinically significant subset of presacral tumors.


The retrorectal space is anatomically complex and relatively inaccessible, bordered anteriorly by the rectum, posteriorly by the sacrum and coccyx, superiorly by the peritoneal reflection, inferiorly by the pelvic floor musculature, and laterally by major vessels, ureters, and neural structures. Lesions arising in this space may remain clinically silent for years due to its capacity to accommodate slow-growing masses. As a result, tailgut cysts are frequently discovered incidentally during imaging performed for unrelated gynecologic, gastrointestinal, or spinal complaints.


Epidemiologically, tailgut cysts show a marked predominance in females and are most often diagnosed in adults between the third and sixth decades of life, although cases have been reported across all age groups, including children. The reasons for the female predominance remain unclear but may relate to increased detection during pelvic imaging or gynecologic evaluation. Despite their congenital origin, presentation in childhood is uncommon, and pediatric cases are particularly prone to misdiagnosis.


Clinical presentation varies widely. Approximately half of affected individuals are asymptomatic at the time of diagnosis. When symptoms occur, they are typically related to mass effect on adjacent structures. Patients may report constipation, tenesmus, pelvic or rectal pain, dysuria, urinary retention, or a sensation of incomplete evacuation. In women, symptoms may fluctuate with hormonal changes or be confused with gynecologic conditions such as endometriosis. In some cases, pain worsens with prolonged sitting or physical activity, reflecting pressure on sacral nerve roots.


Complications can arise when cysts become infected, rupture, or bleed. Infected tailgut cysts may present as recurrent perianal abscesses, fistulas, or chronic inflammatory masses, often leading to delayed diagnosis and repeated ineffective interventions. One of the most clinically significant concerns associated with tailgut cysts is their potential for malignant transformation. Although historically considered rare, malignant degeneration has been increasingly reported, with transformation into adenocarcinoma, neuroendocrine tumors, or squamous cell carcinoma. This oncologic risk underpins the consensus that complete surgical excision is indicated even in asymptomatic patients.


Radiologic imaging plays a central role in diagnosis and preoperative planning. Magnetic resonance imaging is generally considered the modality of choice due to its superior soft tissue contrast and ability to delineate the relationship between the cyst and surrounding pelvic structures. Tailgut cysts typically appear as well-defined, multiloculated cystic lesions with variable signal intensity depending on their content. High signal intensity on T1-weighted images may reflect mucinous or protein-rich material, while T2-weighted images often demonstrate a hyperintense, multicystic pattern. MRI is particularly valuable in assessing extension above or below the levator ani muscle, involvement of the sacrum or coccyx, and features suggestive of malignancy, such as irregular walls, solid components, or enhancement after contrast administration.


Computed tomography can also be useful, especially when MRI is unavailable, but it is less specific in characterizing cyst contents and soft tissue planes. Ultrasonography may detect cystic masses but is limited in deep pelvic evaluation. Preoperative biopsy is generally discouraged due to the risk of infection, tumor seeding, and limited diagnostic yield, as definitive diagnosis relies on histopathological examination of the resected specimen.


Histologically, tailgut cysts are characterized by a heterogeneous epithelial lining that may include stratified squamous, columnar, transitional, or ciliated epithelium, sometimes within the same lesion. The cyst wall may contain fibrous tissue and smooth muscle but lacks the organized muscular layers and neural plexuses seen in true duplication cysts. This histologic diversity reflects the embryologic origin of the lesion and helps distinguish tailgut cysts from other presacral entities such as dermoid cysts, epidermoid cysts, teratomas, anterior meningoceles, and rectal duplications.


The definitive treatment of tailgut cysts is complete surgical excision with clear margins. The choice of surgical approach depends primarily on the size and location of the lesion, its relationship to the pelvic floor, and suspected involvement of adjacent structures. Lesions located above the level of the levator ani (classically, extending above the third sacral vertebra) are commonly approached from an anterior, transabdominal route, while those located lower in the presacral or retroanal space may be more accessible via posterior approaches such as the transsacral or parasacrococcygeal route. In selected cases, a combined anterior and posterior approach is required, particularly for large lesions, extensive adhesions, or suspected bony involvement.


Advances in minimally invasive surgery have significantly influenced the management of tailgut cysts. Laparoscopic and robotic techniques allow enhanced visualization, precise dissection in confined pelvic spaces, and improved preservation of nerves and vascular structures. Robotic-assisted surgery, in particular, offers technical advantages such as three-dimensional visualization, articulated instruments, tremor filtration, and improved ergonomics, which are especially valuable in the narrow presacral space. These techniques have been associated with reduced blood loss, shorter hospital stays, and faster recovery compared to traditional open surgery, albeit with longer operative times in some cases.


Despite these advantages, surgical resection of tailgut cysts remains technically demanding. Dense adhesions to the rectum, pelvic floor muscles, or sacrum may be encountered, especially in cases with prior infection or inflammation. Intraoperative cyst rupture can occur and should be managed with immediate evacuation and irrigation to minimize contamination. Injury to the rectal wall, although uncommon, is a recognized risk and requires prompt repair. In selected cases, partial or complete coccygectomy may be necessary to achieve complete excision and reduce recurrence risk.


Postoperative outcomes are generally favorable when complete resection is achieved. Recurrence is rare but may occur following incomplete excision or cyst rupture. Long-term follow-up with clinical evaluation and periodic imaging is advisable, particularly in cases with atypical histologic features or difficult dissections. When malignant transformation is identified, management must be individualized and may involve additional surgery, chemotherapy, or radiotherapy depending on tumor type and stage.


One of the ongoing challenges in the management of tailgut cysts is diagnostic delay. Nonspecific symptoms, rarity of the condition, and overlap with more common pelvic pathologies contribute to misdiagnosis and prolonged patient morbidity. Increased awareness among surgeons, radiologists, and clinicians is essential to ensure timely identification and appropriate referral. A high index of suspicion should be maintained when evaluating cystic lesions in the presacral space, particularly in middle-aged women with unexplained pelvic or rectal symptoms.


In summary, tailgut cysts are uncommon congenital lesions with variable clinical presentation and significant potential for complications, including malignant transformation. Accurate diagnosis relies on high-quality imaging, while definitive management requires complete surgical excision tailored to the lesion's anatomy. Advances in minimally invasive and robotic surgery have expanded the therapeutic options available and improved perioperative outcomes. Given the complexity of the presacral space and the rarity of these lesions, optimal management depends on careful preoperative planning, detailed knowledge of pelvic anatomy, and meticulous surgical technique. Continued recognition of tailgut cysts as a distinct clinical entity is essential to prevent delayed treatment and to ensure favorable long-term outcomes.


References:
1- Rompen IF, Scheiwiller A, Winiger A, Metzger J, Gass JM: Robotic-Assisted Laparoscopic Resection of Tailgut Cysts. JSLS. 25(3):e2021.00035, 2021
2- Solís-Peña A, Ngu LWS, Kraft Carré M, Gomez Jurado MJ, Vallribera Valls F, Pellino G, Espin-Basany E: Robotic abdominal resection of tailgut cysts – A technical note with step-by-step description. Colorectal Disease. 24(6):793–796, 2022
3- Haval S, Dwivedi D, Nichkaode P: Presacral tailgut cyst. Annals of African Medicine. 23(2):237–241, 2024
4- Shukla R, Patel JD, Chandna SB, Parikh U: Tailgut cyst in a child: A case report and review of literature. African Journal of Paediatric Surgery. 21(3):184–187, 2024
5- Wojciechowski J, Skolozdrzy T, Wojtasik P, Romanowski M: Two cases of symptomatic tailgut cysts. Journal of Clinical Medicine. 13(17):5136, 2024
6- Abatli S, AlHabil Y, Hamad MS, Abulibdeh Y: Mature cystic teratoma mimicking a tailgut cyst in an adolescent female: A case report. Journal of Surgical Case Reports. (11):rjae719, 2024

Blunt Cerebrovascular Injuries

Blunt cerebrovascular injury represents one of the most elusive and potentially devastating consequences of pediatric trauma. Although relatively infrequent when compared with other traumatic injuries, its clinical importance lies in the disproportionate risk of ischemic stroke, long-term neurologic impairment, and mortality. The challenge in pediatric populations is amplified by anatomical, physiological, and developmental factors that obscure early recognition and complicate diagnostic decision-making. As a result, blunt cerebrovascular injury remains both underdiagnosed and inconsistently managed, despite growing awareness of its clinical relevance.


Blunt cerebrovascular injury refers to nonpenetrating damage to the carotid or vertebral arteries caused by mechanical forces such as hyperextension, hyperflexion, rotation, or direct blunt impact. These forces may produce intimal tears, intramural hematomas, pseudoaneurysm formation, arterial dissection, or complete vessel occlusion. While these injuries may initially remain clinically silent, they carry a significant risk of delayed ischemic stroke, sometimes occurring hours or days after the inciting trauma. This delayed presentation contributes to diagnostic uncertainty and underscores the importance of early identification in at-risk patients.


In children, the incidence of blunt cerebrovascular injury has historically been reported as low, often below one percent of all blunt trauma admissions. However, increasing evidence suggests that this figure may reflect underdiagnosis rather than true rarity. Pediatric patients are less likely to undergo vascular imaging, in part due to concerns about radiation exposure and the absence of validated pediatric screening criteria. As imaging practices evolve and awareness increases, reported incidence rates have risen, with some contemporary cohorts identifying rates approaching or exceeding one percent when systematic screening is applied.


Several anatomical and biomechanical characteristics unique to children influence both injury patterns and detection. A proportionally larger head, weaker cervical musculature, greater ligamentous laxity, and increased elasticity of vascular structures contribute to distinctive injury mechanisms. These features may paradoxically offer some protection against vessel rupture while simultaneously predisposing to stretching and intimal damage. The result is a spectrum of vascular injury that may not produce immediate neurologic signs yet carries a substantial risk for delayed ischemic events.


Motor vehicle collisions remain the most common mechanism associated with pediatric blunt cerebrovascular injury. Within this context, restraint use plays a nuanced role. Proper restraint has been shown to reduce overall injury severity and may lower the risk of vascular injury in younger children. Conversely, improper restraint or high-energy mechanisms can transmit rotational and shearing forces to the cervical vasculature, increasing injury risk. Notably, while cervical seatbelt signs have historically been viewed as red flags, their predictive value for vascular injury in children appears inconsistent, and their absence does not exclude significant pathology.


Beyond mechanism of injury, several anatomical and clinical features have emerged as important predictors. Cervical spine fractures, particularly those involving the upper cervical segments, are among the strongest associated factors. Basilar skull fractures, facial fractures—especially Le Fort–type patterns—and intracranial hemorrhage also demonstrate strong associations. Depressed Glasgow Coma Scale scores and higher overall injury severity scores further increase suspicion. Conversely, isolated soft tissue injuries of the neck, once considered highly suggestive, have shown limited predictive value in pediatric populations.


Despite these associations, no single clinical feature reliably predicts blunt cerebrovascular injury. This has led to the development of screening algorithms intended to identify high-risk patients. Many of these tools were initially developed in adult populations and later extrapolated to children. Unfortunately, when applied to pediatric cohorts, these adult-derived criteria demonstrate limited sensitivity. In some analyses, commonly used screening frameworks identify only a small fraction of affected children, missing a substantial number of cases that ultimately develop cerebrovascular complications.


More recent pediatric-focused screening models have attempted to improve sensitivity by incorporating age-specific injury patterns and mechanisms. When applied consistently, these approaches have increased detection rates, but at the cost of increased imaging utilization. This trade-off highlights the ongoing tension between minimizing radiation exposure and preventing devastating neurologic outcomes. Importantly, studies implementing structured screening protocols have demonstrated higher detection rates than historical controls, suggesting that underdiagnosis remains a central concern.


Imaging modality selection remains another critical consideration. Computed tomographic angiography has become the primary diagnostic tool due to its availability and rapid acquisition. However, its sensitivity in detecting subtle intimal injuries is imperfect, particularly in children. While specificity is generally high, false-negative results still occur. Digital subtraction angiography remains the gold standard but is invasive and rarely used as a first-line modality in pediatric trauma. Magnetic resonance angiography offers a radiation-free alternative, although its availability and feasibility in acute settings are limited. Consequently, clinical judgment continues to play a decisive role in determining when imaging is warranted.


Once identified, management strategies for blunt cerebrovascular injury in children largely mirror those used in adults, despite the lack of pediatric-specific outcome data. Antithrombotic therapy—either antiplatelet agents or anticoagulation—constitutes the cornerstone of treatment for most low- to moderate-grade injuries. Surgical or endovascular interventions are reserved for select cases involving high-grade lesions, progressive neurologic deficits, or failure of medical therapy. Observation alone may be appropriate in select low-risk cases, particularly when bleeding risk or concomitant injuries limit pharmacologic intervention.


Outcomes in pediatric patients appear comparable to those observed in adults when injuries are identified and treated promptly. Stroke remains the most feared complication and may occur even after diagnosis and initiation of therapy, although its incidence decreases significantly with early recognition. Reported stroke rates vary across studies, reflecting differences in screening intensity, diagnostic thresholds, and follow-up practices. Importantly, pediatric patients often demonstrate favorable neurological recovery compared with adults, potentially reflecting greater neuroplasticity.


Despite these advances, management remains inconsistent across institutions. Treatment strategies vary widely with respect to medication choice, duration of therapy, and follow-up imaging. Some children discontinue antithrombotic therapy prematurely, while others remain on prolonged treatment without clear evidence-based guidance. These inconsistencies underscore the need for standardized pediatric-specific protocols informed by prospective, multicenter data.


Comparative analyses between pediatric and adult populations reveal both similarities and distinctions. Injury mechanisms and vascular territories involved are broadly comparable, yet children tend to present with higher injury severity scores and more frequent carotid involvement, whereas vertebral artery injuries appear more common in adults. Despite these differences, overall outcomes—including stroke rates and mortality—are largely similar when comparable management strategies are applied. This suggests that adult-derived treatment frameworks may be pragmatically applied to children, though they are not ideal substitutes for pediatric-specific guidelines.


In summary, blunt cerebrovascular injury in children represents a complex and often underrecognized consequence of blunt trauma. Its detection is hindered by subtle clinical presentation, variable risk factors, and limitations of existing screening tools. Recognition of high-risk mechanisms and injury patterns, combined with judicious use of imaging and timely therapeutic intervention, can significantly mitigate the risk of catastrophic neurologic outcomes. Continued research and collaborative efforts are essential to refine screening strategies, optimize management, and ultimately improve outcomes for this vulnerable population.


References:
1- Farzaneh CA, Schomberg J, Sullivan BG, Guner YS, Nance ML, Gibbs D, Yu PT: Development and validation of machine learning models for the prediction of blunt cerebrovascular injury in children. Journal of Pediatric Surgery. 57(4):732–738, 2022
2- El Tawil C, Nemeth J, Al Sawafi M: Pediatric blunt cerebrovascular injuries: Approach and management. Pediatric Emergency Care. 40(4):319–322, 2024
3- Nickoles TA, Lewit RA, Notrica DM, Ryan M, Johnson J, Maxson RT, Naiditch JA, Lawson KA, Temkit M, Padilla B, Eubanks JW III: Lower incidence of blunt cerebrovascular injury among young, properly restrained children: An ATOMAC multicenter study. Journal of Trauma and Acute Care Surgery. 95(3):334–340, 2023
4- Schulz M, Weihing V, Shah MN, Cox CS Jr, Ugalde I: Risk factors for blunt cerebrovascular injury in the pediatric patient: A systematic review. American Journal of Emergency Medicine. 71:37–46, 2023
5- Lewit RA, Nickoles TA, Williams R, Notrica DM, Stottlemyre RL, Ryan M, Johnson JJ, Naiditch JA, Lawson KA, Maxson RT, Grimes S, Eubanks JW III: Blunt cerebrovascular injury in children: A prospective multicenter ATOMAC+ study. Journal of Trauma and Acute Care Surgery. 99(2):245–252, 2025
6- Asaadi S, Rosenthal MG, Radulescu A, Mukherjee K, Luo-Owen X, Dubose JJ, Tabrizi MB; AAST PROOVIT Study Group: Pediatric versus adult blunt cerebrovascular injuries: Patient characteristics, management, and outcomes. Annals of Vascular Surgery. 116:1–8, 2025


PSU Volume 66 No 03 MARCH 2026

Cannabinoid Hyperemesis Syndrome

Cannabis has long occupied an unusual position in medicine and culture. For centuries it has been associated with relief—of pain, anxiety, nausea, and loss of appetite. In modern clinical practice, cannabinoids are frequently invoked as antiemetics, particularly in chemotherapy-induced nausea and vomiting. Yet over the past two decades, an unsettling paradox has emerged: in a subset of chronic users, cannabis appears to provoke the very symptoms it is known to suppress. Cannabinoid Hyperemesis Syndrome (CHS) is the name given to this contradiction, and its increasing prevalence reflects both changing patterns of cannabis use and the evolving potency of the substance itself.


CHS is characterized by recurrent episodes of severe nausea, vomiting, and abdominal pain in the setting of chronic cannabis exposure. Patients are often young, otherwise healthy, and deeply familiar with emergency departments long before a diagnosis is made. What distinguishes CHS from other causes of cyclic vomiting is not a laboratory test or imaging finding, but a constellation of behaviors, histories, and responses that only become coherent when cannabis use is examined honestly and longitudinally.


The syndrome often unfolds in phases. In the prodromal period, patients experience early-morning nausea, vague epigastric discomfort, and a growing fear of vomiting. Appetite may decline, but cannabis use frequently increases, driven by the belief that it will alleviate symptoms. This phase can persist for months or years, often unnoticed or misattributed to anxiety, gastritis, or functional gastrointestinal disorders. Over time, however, the illness progresses into a hyperemetic phase marked by relentless vomiting, abdominal pain, dehydration, electrolyte disturbances, and repeated hospital visits. Vomiting may occur dozens of times per day, leading to acute kidney injury, metabolic derangements, and profound physical exhaustion.


One of the most striking features of CHS is the compulsive use of hot showers or baths for symptomatic relief. Patients often describe standing under scalding water for prolonged periods, sometimes multiple times a day, as the only intervention that provides even transient comfort. This behavior is so characteristic that its presence strongly supports the diagnosis, yet it is frequently overlooked or dismissed as incidental. The relief appears to be mediated through cutaneous heat activation rather than psychological comfort, suggesting a neurophysiologic mechanism rather than a learned coping strategy.


The pathophysiology of CHS remains incompletely understood, but several converging mechanisms have been proposed. Chronic exposure to delta-9-tetrahydrocannabinol (THC) appears to alter cannabinoid receptor signaling, particularly at the CB1 receptor, which plays a central role in gastrointestinal motility, visceral sensation, and emesis control. With sustained stimulation, these receptors may become dysregulated or desensitized, leading to a paradoxical proemetic effect. THC also interacts with dopamine and serotonin pathways, both of which are intimately involved in nausea and vomiting. Over time, these interactions may shift from inhibitory to excitatory, especially in susceptible individuals.


Another important pathway involves the transient receptor potential vanilloid 1 (TRPV1) receptor, commonly known as the capsaicin receptor. TRPV1 is activated by heat and capsaicin and plays a role in pain perception and autonomic regulation. Chronic cannabis use appears to overstimulate TRPV1 receptors centrally while impairing their peripheral modulation, leading to splanchnic vasodilation, nausea, and abdominal pain. External heat or topical capsaicin may temporarily restore balance by activating peripheral TRPV1 receptors, explaining both the compulsive hot bathing behavior and the emerging role of capsaicin cream as a therapeutic adjunct.

Clinically, CHS presents a diagnostic challenge because it closely resembles cyclic vomiting syndrome (CVS), a disorder of gut–brain interaction that predates the recognition of CHS by more than a century. Both conditions feature episodic vomiting with symptom-free intervals, abdominal pain, and significant morbidity. The key distinction lies in the temporal relationship between cannabis use and symptom onset, as well as the resolution of symptoms with sustained abstinence. Unfortunately, this distinction is often blurred because patients with CVS may use cannabis to self-medicate, and patients with CHS frequently deny or underreport use, either due to stigma or genuine disbelief that cannabis could be the cause.

Laboratory and imaging studies in CHS are typically nonspecific. Mild leukocytosis, hypokalemia, metabolic alkalosis, and elevated creatinine from dehydration are common but not diagnostic. Imaging studies are often normal and rarely change management, yet they are frequently repeated as clinicians search for structural explanations. The absence of definitive tests contributes to diagnostic delay and unnecessary healthcare utilization, reinforcing patient frustration and clinician uncertainty.

Acute management of CHS focuses on supportive care. Intravenous fluids are essential to correct dehydration and electrolyte abnormalities. Traditional antiemetics such as ondansetron or promethazine may provide partial relief but are often ineffective. Dopamine antagonists, particularly those that act centrally, have demonstrated greater efficacy in controlling symptoms, though they require careful monitoring due to potential cardiac and extrapyramidal side effects. Benzodiazepines may be helpful in select cases, especially when anxiety exacerbates symptoms, but they do not address the underlying mechanism. Topical capsaicin applied to the abdomen has emerged as a low-cost, low-risk intervention that can reduce nausea and vomiting by exploiting TRPV1-mediated pathways.


Despite these measures, the only definitive treatment for CHS is complete cessation of cannabis use. Symptom resolution typically occurs within days to weeks of abstinence, though residual nausea may persist as THC is slowly released from adipose tissue. Relapse is common if cannabis use resumes, often with a shorter latency and more severe symptoms. This pattern underscores the importance of recognizing CHS not only as a gastrointestinal disorder but also as a condition intertwined with substance use behavior, mental health, and social context.


The chronic phase of management therefore extends beyond the emergency department or hospital ward. Patients require education that reframes cannabis not as a remedy but as a trigger. This conversation is often difficult, particularly in an era when cannabis is widely perceived as benign or therapeutic. Many patients express disbelief, anger, or grief when confronted with the diagnosis, especially if cannabis has played a central role in their identity, coping strategies, or social environment. Addressing comorbid anxiety, depression, and substance use disorder is critical to sustained recovery, as these conditions frequently drive continued use despite clear consequences.


CHS is not a benign syndrome. Repeated episodes of severe vomiting can lead to esophageal injury, aspiration, acute renal failure, and life-threatening electrolyte disturbances. Prolonged QT intervals, particularly in the context of antiemetic use, increase the risk of malignant arrhythmias. The economic burden is substantial, driven by repeated emergency visits, hospitalizations, diagnostic testing, and lost productivity. Yet despite its growing prevalence, CHS remains underrecognized, underdiagnosed, and often misunderstood.


The increasing legalization and commercialization of cannabis have altered both the frequency and intensity of exposure. Modern cannabis products often contain significantly higher concentrations of THC than those used in prior decades, and new delivery systems allow for rapid, repeated dosing. These changes may partially explain why CHS is being identified more frequently and at younger ages. At the same time, cultural narratives surrounding cannabis as a natural or harmless substance may delay recognition of its adverse effects, both by patients and clinicians.


Understanding CHS requires abandoning simple binaries of "good" or "bad" drugs and embracing a more nuanced view of dose, duration, individual susceptibility, and neurobiology. Cannabis can be both antiemetic and emetogenic, therapeutic and toxic, depending on context. CHS occupies the uncomfortable space where these contradictions converge, reminding clinicians that physiology does not always conform to expectation or intention.


As awareness grows, earlier recognition of CHS offers the possibility of reducing harm, avoiding unnecessary testing, and guiding patients toward effective treatment. Doing so requires careful listening, nonjudgmental inquiry into substance use, and a willingness to question assumptions—both the patient's and the clinician's. In this sense, CHS is not only a medical syndrome but also a lesson in clinical humility: a reminder that even familiar remedies can betray us when used without limits, and that relief, like illness, often carries a history we must learn to read.


References:
1- Lonsdale H, Wilsey MJ: Paediatric cannabinoid hyperemesis. Current Opinion in Pediatrics. 34(5):510–515, 2022
2- Geraci E, Cake C, Mulieri KM, Fenn NE 3rd: Comparison of antiemetics in the management of pediatric cannabinoid hyperemesis syndrome. Journal of Pediatric Pharmacology and Therapeutics. 28(3):222–227, 2023
3- Shah M, Jergel A, George RP, Jenkins E, Bashaw H: Distinguishing clinical features of cannabinoid hyperemesis syndrome and cyclic vomiting syndrome: A retrospective cohort study. The Journal of Pediatrics. 271:114054, 2024
4- Ibia IE, Toce MS: Cannabis hyperemesis syndrome in children: A review of epidemiology, pathology, diagnosis, and treatment. Pediatric Emergency Care. 41(5):397–405, 2025
5- Meyer J, Burns MM: Current recommendations in the diagnosis and management of cannabinoid hyperemesis syndrome. Current Opinion in Pediatrics. 37(3):240–243, 2025
6- Yacob D: Cyclic vomiting syndrome and cannabinoid hyperemesis syndrome: Their intersection and joint existence. Gastroenterology Clinics of North America. 54(3):557–568, 2025

Non-Operative Management of Appendicitis

Acute appendicitis remains one of the most common surgical emergencies worldwide, traditionally managed by appendectomy as definitive therapy. For more than a century, early surgical removal of the appendix was justified by the belief that appendicitis represents a progressive disease that inevitably leads to perforation if left untreated. However, advances in diagnostic imaging, antimicrobial therapy, and a growing body of clinical evidence have challenged this paradigm, giving rise to renewed interest in non-operative management using antibiotics alone, particularly in cases of uncomplicated appendicitis.


The conceptual shift underlying non-operative management is rooted in the recognition that appendicitis may not represent a single disease process. Instead, it appears to encompass a spectrum ranging from mild, self-limited inflammation to severe gangrenous or perforated disease. This distinction has profound implications for treatment strategies. Uncomplicated appendicitis, characterized by localized inflammation without perforation, abscess, or phlegmon, has emerged as a potential target for conservative treatment. The increasing use of high-resolution ultrasound and computed tomography has improved diagnostic accuracy, enabling clinicians to more reliably identify patients who may be suitable for non-operative approaches.


Across adult and pediatric populations, antibiotic-first strategies have demonstrated high rates of initial clinical success. Most patients experience symptom resolution during the index admission without the need for urgent surgery. These findings suggest that, in selected patients, acute appendicitis can be effectively controlled with antimicrobial therapy, avoiding the immediate risks associated with anesthesia and surgery. Moreover, the observation that many patients do not experience disease progression despite delayed or absent surgical intervention has further weakened the long-held assumption that appendicitis is uniformly progressive.


Despite these encouraging early outcomes, the long-term durability of non-operative management remains a central concern. Recurrence of appendicitis or failure of antibiotic therapy requiring appendectomy is consistently reported during follow-up, particularly within the first year. While a substantial proportion of patients avoid surgery altogether, cumulative failure rates increase over time, resulting in a significant minority ultimately undergoing appendectomy. This pattern underscores an important distinction between short-term treatment success and definitive cure. From a clinical perspective, non-operative management may be best understood not as a replacement for surgery, but as an alternative initial strategy that defers or potentially avoids operative intervention.


Complication profiles associated with non-operative and operative management differ in nature rather than magnitude. Appendectomy, even when performed laparoscopically, carries risks related to anesthesia, surgical site infection, postoperative pain, and, in rare cases, more serious adverse events. However, contemporary surgical techniques have markedly reduced morbidity, and appendectomy remains one of the safest emergency operations performed in both adults and children. In contrast, non-operative management avoids surgical risks but introduces others, including antibiotic-related adverse effects, increased rates of unplanned healthcare visits, and the psychological burden associated with recurrence risk. Importantly, available evidence suggests that delayed appendectomy following failed non-operative treatment does not result in a substantially higher rate of severe complications when appropriate monitoring and timely intervention are ensured.


Length of hospital stay has been widely examined as a comparative outcome between treatment strategies. Contrary to the perception that conservative management necessarily shortens hospitalization, antibiotic-based treatment often requires prolonged observation and intravenous therapy, leading to longer initial hospital stays than early appendectomy. Surgical management, particularly when minimally invasive, offers predictable postoperative recovery and discharge timelines. Nevertheless, some patients treated non-operatively may resume normal activities sooner and require less postoperative analgesia, highlighting that hospital length of stay alone does not fully capture functional recovery.


The presence of an appendicolith has emerged as a critical predictor of non-operative treatment failure. Patients with appendicoliths consistently demonstrate higher rates of recurrence, complications, and subsequent appendectomy when managed with antibiotics alone. This finding supports the hypothesis that luminal obstruction plays a key role in disease persistence and progression in a subset of patients. As a result, many contemporary protocols exclude patients with appendicoliths from non-operative management, emphasizing the importance of careful patient selection based on imaging findings.


In pediatric populations, the debate surrounding non-operative management is particularly nuanced. Children generally tolerate appendectomy well, with low complication rates and excellent long-term outcomes. At the same time, avoidance of surgery may be appealing to families seeking to minimize procedural intervention, postoperative pain, or school absence. Evidence in children demonstrates that non-operative management is safe in the short term, with no increase in mortality or severe morbidity. However, non-inferiority to appendectomy has not been consistently demonstrated when long-term failure rates are considered. A substantial proportion of children initially treated with antibiotics ultimately require appendectomy, raising questions about the overall effectiveness of conservative management in this population.


Quality of life considerations further complicate treatment decisions. Patients managed non-operatively often report less pain and reduced use of analgesics in the early phase, as well as faster return to daily activities. Conversely, the uncertainty associated with recurrence risk and the need for ongoing vigilance may negatively impact long-term quality of life for some patients and families. Appendectomy, while associated with short-term postoperative discomfort, offers definitive resolution and eliminates the risk of recurrence. These contrasting experiences highlight the importance of incorporating patient and family preferences into shared decision-making processes.


From a healthcare system perspective, non-operative management offers both potential benefits and challenges. Reduced operative volume may alleviate surgical workload and resource utilization, particularly in settings with limited operating room availability. However, increased rates of emergency department visits, readmissions, and delayed surgery may offset these advantages. Economic analyses remain heterogeneous, reflecting differences in healthcare delivery models, antibiotic protocols, and follow-up practices.


Taken together, current evidence supports non-operative management as a safe and feasible option for carefully selected patients with uncomplicated appendicitis, particularly in the absence of appendicoliths and when reliable follow-up can be ensured. Nonetheless, appendectomy remains the most definitive treatment, with the highest likelihood of permanent resolution and predictable outcomes. Rather than framing these strategies as competing approaches, contemporary practice increasingly recognizes them as complementary options within a patient-centered framework.


Future research should focus on refining selection criteria, identifying biomarkers predictive of sustained response to antibiotics, and standardizing treatment protocols. Long-term outcome data extending beyond one year are essential to better define true treatment effectiveness. Additionally, greater emphasis on patient-reported outcomes will enhance understanding of how different management strategies impact quality of life.


In conclusion, non-operative management represents a significant evolution in the treatment of acute appendicitis. While it challenges long-standing surgical dogma, its role is best defined as an individualized option rather than a universal substitute for appendectomy. Ongoing evidence continues to shape a more nuanced, personalized approach to appendicitis care, balancing efficacy, safety, patient preference, and healthcare system considerations.


References:
1- Jumah S, Wester T: Non-operative management of acute appendicitis in children. Pediatric Surgery International. 39(1):11, 2022
2- Zagales I, Sauder M, Selvakumar S, Spardy J, Santos RG, Cruz J, Bilski T, Elkbuli A: Comparing outcomes of appendectomy versus non-operative antibiotic therapy for acute appendicitis: A systematic review and meta-analysis of randomized clinical trials. The American Surgeon. 89(6):2644–2655, 2023
3- Decker E, Ndzi A, Kenny S, Harwood R: Systematic review and meta-analysis to compare the short- and long-term outcomes of non-operative management with early operative management of simple appendicitis in children after the COVID-19 pandemic. Journal of Pediatric Surgery. 59(6):1050–1057, 2024
4- Adams SE, Perera MRS, Fung S, Maxton J, Karpelowsky J: Non-operative management of uncomplicated appendicitis in children: A randomized, controlled, non-inferiority study evaluating safety and efficacy. ANZ Journal of Surgery. 94(9):1569–1577, 2024
5- St Peter SD, Noel-MacDonnell JR, Hall NJ, Eaton S, Suominen JS, Wester T, Svensson JF, Almström M, Muenks EP, Beaudin M, Piché N, Brindle M, MacRobie A, Keijzer R, Engstrand Lilja H, Kassa AM, Jancelewicz T, Butter A, Davidson J, Skarsgard E, Te-Lu Y, Nah S, Willan AR, Pierro A: Appendicectomy versus antibiotics for acute uncomplicated appendicitis in children: An open-label, international, multicentre, randomised, non-inferiority trial. The Lancet. 405:233–240, 2025
6- Brucchi F, Filisetti C, Luconi E, Fugazzola P, Cattaneo D, Ansaloni L, Zuccotti G, Ferraro S, Danelli P, Pelizzo G: Non-operative management of uncomplicated appendicitis in children, why not? A meta-analysis of randomized controlled trials. World Journal of Emergency Surgery. 20:25, 2025

Pediatric Crotalid Snakebites

Pediatric crotalid snakebites represent a distinct but well-characterized subset of venomous injuries in the United States, accounting for a substantial proportion of snakebite-related morbidity in children. Crotalid snakes, which include rattlesnakes, copperheads, and cottonmouths, are responsible for the vast majority of venomous snakebites nationwide. Although children differ physiologically from adults, accumulated evidence indicates that the clinical course, systemic toxicity, and outcomes of pediatric crotalid envenomation closely parallel those observed in adults, with important nuances related to venom effects, laboratory abnormalities, and patterns of care.


Envenomation typically results from defensive bites and most often involves the extremities. Lower extremity bites predominate overall, particularly in younger children, whereas upper extremity bites are more common in older children and adolescents, reflecting behavioral and environmental exposure patterns. Local manifestations are nearly universal and include pain, edema, erythema, and ecchymosis, which may progress proximally from the bite site. Tissue necrosis and blistering occur less frequently and, when present, are often associated with delayed presentation or more severe envenomation. Importantly, after adjusting for bite location, the likelihood of necrosis does not differ substantially between pediatric and adult patients, underscoring that venom dose and composition rather than patient size are key determinants of local tissue injury.


Systemic toxicity is a defining concern in crotalid envenomation and is primarily hematologic in nature. Venom-induced coagulopathy, hypofibrinogenemia, and thrombocytopenia result from consumption and degradation of clotting factors mediated by venom metalloproteinases and other enzymes. Pediatric patients demonstrate early hematologic abnormalities at rates comparable to or slightly higher than adults, particularly with respect to hypofibrinogenemia and prolonged coagulation parameters during the initial phase of care. However, late or recurrent hematologic toxicity, which may occur after apparent initial control, develops at similar frequencies in children and adults and rarely leads to clinically significant bleeding when appropriately monitored and treated.


Geographic and climatic factors influence the epidemiology and severity of pediatric snakebites. Children bitten in semi-arid regions are more likely to encounter rattlesnakes, present earlier to care, and require higher levels of monitoring and antivenom administration compared with those in subtropical regions, where copperhead bites are more common. These regional differences translate into longer hospital stays, increased intensive care utilization, and higher antivenom dosing in high-risk environments, despite similar rates of laboratory abnormalities and overall survival. Notably, mortality from pediatric crotalid envenomation remains exceedingly rare in modern series.


Antivenom therapy is the cornerstone of treatment for moderate to severe envenomation and is administered based on clinical progression rather than patient age or weight. Ovine-derived Crotalidae polyvalent immune Fab has become the most widely used antivenom and has demonstrated a favorable safety profile in children. Acute hypersensitivity reactions, historically a major concern with older whole IgG antivenoms, are uncommon with Fab-based products. Large pediatric cohorts have reported no acute hypersensitivity reactions during or shortly after infusion, even among patients requiring intensive care and relatively high cumulative doses. Delayed complications such as recurrent coagulopathy may occur but are not directly attributable to allergic mechanisms and instead reflect the pharmacokinetics of venom and antivenom interactions.


Despite its efficacy, antivenom use varies widely, particularly in copperhead envenomation, which is often milder and may be self-limited. Younger age, upper extremity bites, progression of local tissue effects across major joints, and the presence of comorbidities have all been associated with increased likelihood of antivenom administration. These practice variations highlight ongoing controversy regarding optimal thresholds for treatment and emphasize the need for standardized, evidence-based decision tools to balance benefits, risks, and resource utilization.


In response to variability in care, pediatric-specific management strategies have been developed to better align treatment intensity with clinical severity. The Pediatric Crotalid Envenomation Score integrates physical examination findings and basic coagulation laboratory values to stratify patients into severity tiers that guide admission level and antivenom dosing. Implementation of such structured guidelines has been associated with significant reductions in intensive care admissions and ICU length of stay, without increases in hospital length of stay, readmissions, or adverse outcomes. Importantly, these protocols preserve excellent clinical results while conserving critical resources and reducing unnecessary exposure to antivenom in children with mild envenomation.


Overall outcomes in pediatric crotalid snakebites are favorable when modern supportive care, timely antivenom administration, and appropriate monitoring are employed. Surgical intervention is rarely required and is typically limited to selected cases involving compartment syndrome or significant tissue compromise. Long-term functional impairment is uncommon, and most children recover fully with minimal residual effects. The growing body of pediatric-focused evidence reinforces that children should not be managed more aggressively solely because of age or size; rather, they should be treated according to objective clinical and laboratory indicators of venom effect.


In summary, pediatric crotalid snakebites produce a spectrum of local and systemic effects that closely resemble those seen in adults. Early hematologic abnormalities may be more prominent in children, but overall severity, late toxicity, and outcomes are similar across age groups. Antivenom therapy is safe and effective in pediatric patients, with a very low incidence of hypersensitivity reactions. Regional differences in snake species and exposure patterns influence resource utilization, underscoring the importance of context-specific preparedness. The adoption of pediatric-specific severity scoring systems and treatment guidelines represents an important advance, enabling high-quality, efficient care while maintaining excellent outcomes for children affected by crotalid envenomation.


References:
1- Levine M, Ruha AM, Wolk B, Caravati M, Brent J, Campleman S, Wax P; ToxIC North American Snakebite Study Group: When it comes to snakebites, kids are little adults: A comparison of adults and children with rattlesnake bites. J Med Toxicol. 16(4):444–451, 2020
2- Chotai PN, Watlington J, Lewis S, Pyo T, Abdelgawad AA, Huang EY: Pediatric Snakebites: Comparing Patients in Two Geographic Locations in the United States. J Surg Res. 265:297–302, 2021
3- Corbett B, Otter J, Masom CP, Clark RF: Prevalence of Acute Hypersensitivity Reactions in Pediatric Patients Receiving Crotalidae Polyvalent Immune Fab. J Med Toxicol. 17(1):48–50, 2021
4- Ramirez-Cueva F, Larsen A, Knowlton E, Baab K, Rainey Kiehl R, Hendrix A, Condren M, Woslager M: Predictors of FabAV use in copperhead envenomation. Clin Toxicol (Phila). 60(5):609–614, 2022
5- Malek AJ, Criscitiello AA, Nes EK, Regner JL, Zamin SA, Wills HE, Little DC, Stagg HW: Development of the pediatric Crotalid envenomation score guideline and its influence on resource utilization. J Pediatr Surg. 61(1):162549, 2026


PSU Volume 66 No 04 APRIL 2026

US-Guided Subclavian Cannulation

Central venous access remains a cornerstone of modern critical care, anesthesiology, emergency medicine, pediatrics, and long-term infusion therapy. Among available access sites, the subclavian venous system has historically been favored because of lower infection rates, improved patient comfort, and reliable catheter stability. However, traditional landmark-based subclavian cannulation has long been associated with concerns about mechanical complications, particularly pneumothorax and arterial injury. The integration of real-time ultrasound guidance has fundamentally altered this risk-benefit balance, enabling safer visualization, higher success rates, and renewed clinical interest in subclavian access.


Ultrasound-guided subclavian cannulation represents not merely a technical modification of an older procedure, but a conceptual shift in how clinicians approach central venous access. By transforming a "blind" technique into a visual, anatomy-driven intervention, ultrasound allows dynamic assessment of vascular patency, anatomic variation, and needle trajectory. This evolution is particularly relevant in patients with altered anatomy, prior catheterization, coagulopathy, hypovolemia, or in populations such as neonates and children where margins for error are narrow.


A fundamental advantage of ultrasound guidance lies in its ability to identify individual anatomic variability. The subclavian vein may differ substantially in depth, diameter, and spatial relationship to the artery, pleura, and clavicle. Landmark techniques cannot reliably account for these variations, whereas ultrasound permits direct visualization before and during needle advancement. Preprocedural scanning allows confirmation of venous patency, exclusion of thrombosis, and selection of the safest puncture site, while real-time imaging enables continuous monitoring of needle position relative to critical structures.


Modern ultrasound-guided subclavian cannulation is most commonly performed using either an infraclavicular or supraclavicular approach. In the infraclavicular technique, the vein is visualized laterally where it anatomically corresponds to the proximal axillary vein, a location that offers improved ultrasound windows and increased distance from the pleural dome. This distinction is clinically important: while the term "subclavian cannulation" remains widely used, the actual puncture site in many ultrasound-guided approaches is anatomically axillary, a clarification that has implications for procedural standardization, safety comparisons, and educational accuracy. Failure to recognize this distinction may obscure meaningful differences between techniques and complicate interpretation of outcomes.


From a technical standpoint, ultrasound guidance can be applied using short-axis (out-of-plane), long-axis (in-plane), or oblique approaches. Each method carries distinct advantages. Short-axis views provide excellent visualization of surrounding anatomy and are often easier for less experienced operators, while long-axis views allow continuous visualization of the needle shaft and tip, reducing the risk of posterior wall penetration. The oblique approach seeks to combine the benefits of both, though it requires greater operator experience. Current evidence does not conclusively favor one approach over another, underscoring the importance of operator familiarity and consistent training rather than rigid technique selection.


Accumulating clinical data demonstrate that ultrasound-guided subclavian cannulation improves procedural safety compared with landmark techniques. Reductions in arterial puncture, hematoma formation, and pneumothorax have been consistently observed, particularly when real-time guidance is employed. While the magnitude of benefit may be smaller than that seen with internal jugular access, the absolute reduction in serious complications is clinically meaningful, especially given the advantages of subclavian catheter placement for long-term use. Importantly, success rates with ultrasound guidance approach those achieved with jugular access, challenging the perception that the subclavian route is inherently more difficult or dangerous.


One of the most compelling expansions of ultrasound-guided subclavian cannulation has occurred in neonatal and pediatric care. In low birth weight and very low birth weight infants, central venous access is often required when umbilical or peripheral routes are unavailable or inadequate. Ultrasound-guided supraclavicular subclavian cannulation has demonstrated high success rates even in infants weighing less than 1,500 grams, with remarkably low complication profiles. Visualization of the vein, pleura, and adjacent structures allows precise needle control in patients for whom landmark-based techniques would be prohibitively risky. These findings reinforce the role of ultrasound not only as a safety adjunct, but as an enabler of access strategies previously considered impractical in fragile populations.


Ultrasound guidance also expands available options when conventional access sites are exhausted. In patients with venous thrombosis, stenosis, or prior catheter-related injury, alternative routes such as the supraclavicular approach to the brachiocephalic vein can be employed under direct visualization. This adaptability is particularly valuable for tunneled catheters and long-term devices, where preservation of remaining venous access is critical. Real-time ultrasound allows these alternative approaches to be executed with precision, minimizing repeated failed attempts and associated complications.


Comparative evidence has further highlighted the role of ultrasound-guided axillary vein cannulation as a safer alternative to landmark-guided subclavian access. Because the axillary vein lies entirely outside the thoracic cavity, ultrasound-guided puncture at this site preserves the benefits of subclavian catheterization while significantly reducing the risk of pneumothorax and hemothorax. Meta-analytic data indicate higher first-pass success rates and lower mechanical complication rates with ultrasound-guided axillary access compared with landmark subclavian techniques, reinforcing the value of ultrasound in redefining what is traditionally labeled as "subclavian" cannulation.


Despite these advances, widespread adoption of ultrasound-guided subclavian cannulation has been hindered by training gaps. As landmark techniques fell out of favor and subclavian access was deprioritized, procedural experience declined among clinicians and trainees. Ultrasound guidance alone does not eliminate the need for deliberate skill acquisition. High-fidelity simulation models and structured curricula have emerged as effective tools to restore competency, allowing practitioners to rehearse needle visualization, probe manipulation, and complication management in a controlled environment. Simulation-based education is particularly valuable for maintaining proficiency in high-risk, low-frequency procedures and has demonstrated strong face validity among experienced clinicians.


From an educational perspective, ultrasound-guided subclavian cannulation demands integration of cognitive anatomy, image interpretation, and psychomotor coordination. Successful performance requires understanding not only vascular anatomy, but also the dynamic relationship between probe orientation, needle angle, and ultrasound artifacts. Structured training programs that emphasize image acquisition, needle tracking, and error recognition are essential to translate theoretical safety benefits into real-world outcomes.


In clinical practice, ultrasound-guided subclavian cannulation should be viewed as a complementary skill rather than a competing alternative to internal jugular or femoral access. Patient-specific factors—including infection risk, duration of therapy, mobility needs, and existing vascular access—should guide site selection. Ultrasound expands the clinician's ability to tailor access strategies to individual patients, rather than defaulting to a single approach based on habit or perceived ease.


In conclusion, ultrasound-guided subclavian cannulation represents a mature, evidence-supported technique that reconciles the historical advantages of subclavian access with modern safety standards. By enabling real-time visualization, accommodating anatomic variability, and expanding access options across adult, pediatric, and neonatal populations, ultrasound has transformed subclavian cannulation from a high-risk procedure into a controlled, reproducible intervention. Continued emphasis on precise terminology, structured training, and simulation-based education will be essential to fully integrate this technique into routine clinical practice and to ensure that its benefits are consistently realized.


References:
1- Lausten-Thomsen U, Merchaoui Z, Dubois C, Eleni Dit Trolli S, Le Saché N, Mokhtari M, Tissières P: Ultrasound-Guided Subclavian Vein Cannulation in Low Birth Weight Neonates. Pediatric Critical Care Medicine. 18(2):172–175, 2017
2- Saugel B, Scheeren TWL, Teboul JL: Ultrasound-guided central venous catheter placement: a structured review and recommendations for clinical practice. Critical Care. 21(1):225, 2017
3- Yamamoto T, Arai Y, Schindler E: Real-time ultrasound-guided supraclavicular technique as a possible alternative approach for Hickman catheter implantation. Journal of Pediatric Surgery. 55(6):1157–1161, 2020
4- Davies TW, Montgomery H, Gilbert-Kawai E: Cannulation of the subclavian vein using real-time ultrasound guidance. Journal of the Intensive Care Society. 21(4):349–354, 2020
5- Zhou J, Wu L, Zhang C, Wang J, Liu Y, Ping L: Ultrasound guided axillary vein catheterization versus subclavian vein cannulation with landmark technique: A PRISMA-compliant systematic review and meta-analysis. Medicine (Baltimore). 101(43):e31509, 2022
6- Tanwani J, Nabecker S, Hiansen JQ, Mashari A, Siddiqui N, Arzola C, Goffi A, Peacock S: Use of a Novel Three-Dimensional Model to Teach Ultrasound-guided Subclavian Vein Cannulation. ATS Scholar. 4(3):344–353, 2023
7- Gawda R, Czarnik T: Ultrasound-guided infraclavicular cannulation of the subclavian vein – still an ongoing misconception. Journal of the Intensive Care Society. 24(3 Suppl):10, 2023

Language-Concordant Clinic

Modern health care unfolds in an increasingly multilingual world. Millions of patients seek medical attention in settings where the language of care differs from the language in which they think, feel, and make sense of illness. This linguistic mismatch is not a peripheral inconvenience; it is a structural determinant of health. Language-concordant clinics emerge from this reality not merely as a service innovation, but as a reframing of communication itself as a core clinical intervention.


Language-concordant care occurs when patients and clinicians share a common language and are able to communicate directly, fluently, and comfortably throughout the clinical encounter. In language-concordant clinics, this principle is embedded at the organizational level: appointments, workflows, educational materials, and clinical interactions are intentionally designed to occur in the patient's preferred language. This model goes beyond episodic interpretation and establishes linguistic alignment as a foundational element of care delivery.


The clinical consequences of language discordance are well documented. When communication is filtered through language barriers, patients experience diminished understanding of diagnoses, reduced participation in decision-making, lower satisfaction, and increased vulnerability during critical moments such as consent, discharge, and treatment planning. These effects are not limited to subjective experience. Language discordance has been associated with higher rates of medical errors, longer hospital stays, delayed care, and poorer control of chronic disease. In this context, language is not simply a vehicle for information exchange; it shapes trust, safety, and clinical outcomes.


Trust occupies a central position in the therapeutic relationship, and language is one of its most powerful determinants. Trust requires honesty, clarity, and the ability to express concerns, fears, and values without hesitation. When patients must rely on intermediaries to communicate intimate or complex information, trust becomes fragile. Even when professional interpreters are used appropriately, the presence of a third party can alter conversational flow, limit spontaneity, and subtly constrain disclosure. Language-concordant encounters, by contrast, allow patients to speak in their own voice and clinicians to respond with nuance, empathy, and immediacy. This directness fosters a sense of being heard and respected, which in turn strengthens engagement and adherence.


Language-concordant clinics do not dismiss the essential role of professional interpreters. Interpreters remain critical for ensuring access, equity, and legal compliance, particularly when language-concordant clinicians are unavailable. However, evidence consistently shows that interpreter-mediated encounters, while superior to ad hoc or absent interpretation, do not fully replicate the relational depth of direct communication. Patients in language-concordant settings are more likely to ask questions, clarify uncertainties, and actively participate in their care. This increased engagement is especially evident in pediatric and family-centered contexts, where caregivers must understand and consent to complex interventions on behalf of others.


Informed consent represents one of the most ethically sensitive domains in medicine, and language discordance poses persistent risks to its integrity. Consent requires not only the transmission of information but the assurance that information is understood. When consent discussions occur in a non-preferred language or through inconsistent interpretation, comprehension may be partial, documentation incomplete, and patient autonomy compromised. Language-concordant clinics reduce these risks by aligning the consent process with the patient's linguistic reality, thereby reinforcing both ethical standards and patient safety.


The benefits of language-concordant care extend beyond individual encounters to system-level outcomes. Clinics that operate in a shared language often demonstrate improved efficiency, fewer misunderstandings, and smoother clinical workflows. Time that might otherwise be spent clarifying miscommunication or correcting errors is redirected toward meaningful clinical engagement. In emergency and inpatient settings, professional interpretation modalities—particularly video-based options—have demonstrated improvements in comprehension and satisfaction, yet even these modalities are often underutilized due to workflow barriers and cultural habits. Language-concordant clinics bypass many of these obstacles by embedding communication fluency directly into care delivery.


Importantly, language-concordant clinics also serve as a lens through which broader social determinants of health become visible. Language is intertwined with migration history, educational opportunity, socioeconomic status, and exposure to trauma. Patients who prefer non-dominant languages often navigate health systems shaped by structural inequities that extend far beyond communication. When care is delivered in a shared language, clinicians gain deeper insight into patients' lived experiences, belief systems, and contextual challenges. This understanding enables more culturally responsive care and more realistic treatment planning.


The educational implications of language-concordant clinics are profound. As health systems become more linguistically diverse, the preparation of clinicians must evolve accordingly. Linguistic competence cannot be treated as an informal skill acquired incidentally or self-declared without assessment. Safe language-concordant care requires rigorous training, standardized evaluation, and clear institutional policies defining when clinicians are qualified to practice in a non-dominant language. Without these safeguards, well-intentioned efforts risk introducing new forms of error and inequity.


Technology offers emerging opportunities to support language-concordant care, particularly in settings where bilingual clinicians are scarce. Digital interpretation platforms, video-based services, and AI-assisted translation tools have demonstrated potential to expand access and reduce delays. However, these tools must be implemented thoughtfully. Technology can enhance communication, but it cannot substitute for linguistic competence, cultural humility, or relational trust. Moreover, AI-generated translations require careful oversight to ensure accuracy, contextual appropriateness, and patient safety. In language-concordant clinics, technology functions best as an adjunct rather than a replacement for human connection.


The implementation of language-concordant clinics requires institutional commitment. Scheduling systems must align patients with linguistically appropriate providers. Educational materials must be available in relevant languages. Clinical teams must be trained to recognize language preference not as a binary attribute but as a spectrum shaped by context, stress, and health literacy. Quality improvement initiatives should track language alignment as a measurable dimension of care quality, alongside traditional clinical metrics.


Critically, language-concordant clinics challenge the assumption that interpretation alone is sufficient to achieve equity. While interpretation is indispensable, equity demands more than access; it demands resonance. Resonance occurs when patients recognize themselves in the language of care, when their concerns are not translated but expressed directly, and when clinicians listen without filters. Language-concordant clinics institutionalize this resonance.


From a policy perspective, language-concordant clinics represent an investment in prevention. By reducing misunderstandings, enhancing adherence, and strengthening trust, these clinics mitigate downstream costs associated with complications, readmissions, and disengagement from care. They also signal respect for linguistic diversity as an asset rather than a barrier, reframing multilingualism as a clinical resource.


Ultimately, language-concordant clinics reaffirm a fundamental truth: medicine begins not with technology or protocol, but with human interaction. Every diagnosis, every decision, and every act of healing is negotiated through language. When that language is shared, care becomes not only more effective, but more humane. Language-concordant clinics do not merely translate medicine; they restore its voice.


References:
1- Molina RL, Kasper J: The power of language-concordant care: a call to action for medical schools. BMC Medical Education. 19(1):378, 2019
2- Boylen S, Cherian S, Gill FJ, Leslie GD, Wilson S: Impact of professional interpreters on outcomes for hospitalized children from migrant and refugee families with limited English proficiency: a systematic review. JBI Evidence Synthesis. 18(7):1360–1388, 2020
3- Daggett A, Abdollahi S, Hashemzadeh M: The effect of language concordance on health care relationship trust score. Cureus. 15(5):e39530, 2023
4- Sharfuddin N, Mathura P, Mac A, Ling E, Tan M, Khatib E, Suranyi Y, Kassam N: Advancing language concordant care: a multimodal medical interpretation intervention. BMJ Open Quality. 13(1):e002511, 2024
5- Dzuali F, Seiger K, Novoa R, Aleshin M, Teng J, Lester J, Daneshjou R: ChatGPT may improve access to language-concordant care for patients with non–English language preferences. JMIR Medical Education. 10:e51435, 2024

Splenoportal Thrombosis After Splenectomy

Splenoportal thrombosis represents one of the most important vascular complications following splenectomy, emerging at the intersection of surgical physiology, hematologic adaptation, and altered portal venous hemodynamics. Although splenectomy remains a highly effective therapeutic intervention for hematologic disease, portal hypertension, trauma, and selected oncologic conditions, removal of the spleen initiates profound systemic and regional circulatory changes that predispose patients to thrombosis within the splenic, portal, and mesenteric venous systems. The growing body of contemporary literature has clarified that this complication is neither rare nor incidental but rather a predictable biological consequence that demands structured prevention, early recognition, and individualized management.


The incidence of splenoportal thrombosis after splenectomy varies widely across clinical populations, reflecting differences in underlying disease, surgical indication, and surveillance strategies. Reported rates generally range from approximately 5% to more than 20%, with higher values observed when routine postoperative imaging is performed rather than symptom-driven investigation. Some analyses suggest even broader ranges in high-risk populations, particularly those with portal hypertension or cirrhosis, underscoring that the true incidence may historically have been underestimated because asymptomatic cases remained undetected.


Clinically, splenoportal thrombosis encompasses thrombosis involving the splenic vein, portal vein, superior mesenteric vein, or combinations of these vessels. The condition may present subtly, with abdominal pain, fever, nausea, or nonspecific malaise, but can progress to severe complications including intestinal ischemia, portal hypertension exacerbation, or hepatic dysfunction if untreated. Importantly, many patients remain asymptomatic during early stages, making imaging surveillance a critical component of postoperative care in selected populations.


The pathophysiology of thrombosis after splenectomy is multifactorial and best understood through Virchow's triad: hypercoagulability, endothelial injury, and venous stasis. Splenectomy induces an immediate hematologic shift characterized by increased circulating platelets and procoagulant factors. Postoperative thrombocytosis occurs in a majority of patients and may reach extreme levels, although the direct relationship between platelet elevation and thrombosis remains complex. Some investigations demonstrate that elevated platelet counts correlate with thrombotic risk and may serve as predictive thresholds, while others suggest thrombosis occurs independently of thrombocytosis alone, indicating that platelet number is only one component of a broader prothrombotic environment.


Beyond hematologic alterations, splenectomy dramatically modifies portal venous flow dynamics. Removal of the spleen eliminates a major inflow contributor to the portal system, producing abrupt changes in velocity patterns, pressure gradients, and wall shear stress. Computational modeling studies have demonstrated that areas of low wall shear stress increase after splenic vein ligation, creating localized hemodynamic environments conducive to clot formation. These changes are influenced by anatomical variables such as splenic vein diameter and portal venous geometry, suggesting that patient-specific vascular architecture plays a central role in thrombosis susceptibility.


Surgical factors further modulate risk. Longer operative times, greater tissue manipulation, and technical complexity appear associated with higher thrombotic incidence, likely reflecting both inflammatory activation and prolonged venous stasis. Massive splenomegaly has emerged as one of the strongest predictors, particularly when specimen weight exceeds one kilogram. Large spleens are associated with increased venous caliber and altered postoperative flow redistribution, amplifying stasis within the portal circulation. Hematologic disorders such as myeloproliferative disease and myelofibrosis also confer elevated risk, combining intrinsic hypercoagulability with anatomical changes.


Interestingly, the surgical approach itself—open versus laparoscopic—does not appear to independently determine thrombosis rates. While minimally invasive splenectomy offers reduced postoperative morbidity and shorter hospitalization, venous thrombotic incidence remains comparable between approaches. This observation reinforces the concept that thrombosis arises primarily from physiological consequences of splenic removal rather than technical modality.


Temporal patterns of thrombosis formation provide additional insight. Most events occur within the first two to three postoperative weeks, although risk may persist for months. This delayed vulnerability highlights a critical limitation of traditional in-hospital prophylaxis strategies that terminate anticoagulation at discharge. Evidence increasingly suggests that extended thromboprophylaxis significantly reduces thrombotic events without substantially increasing bleeding complications. Patients receiving anticoagulation beyond hospitalization demonstrate markedly lower thrombosis rates compared with those treated only perioperatively.


The role of anticoagulation has therefore become central in preventive strategies. Analyses evaluating postoperative anticoagulant therapy indicate that low-molecular-weight heparin followed by oral anticoagulation can substantially decrease portal venous thrombosis incidence, particularly during the first six postoperative months. Importantly, concerns regarding excessive bleeding risk have not been consistently supported by clinical outcomes, suggesting that carefully selected prophylactic regimens are both effective and safe.


Despite these advances, universal anticoagulation remains controversial. Not all patients carry equal risk, and indiscriminate therapy may expose low-risk individuals to unnecessary complications. Consequently, recent research has focused on risk prediction models designed to identify patients most likely to develop thrombosis. These models integrate clinical, laboratory, and anatomical variables, achieving moderate discrimination—reliable but imperfect predictive performance. While promising, many models suffer from limited external validation and retrospective design, emphasizing the need for individualized clinical judgment rather than reliance on algorithms alone.


Emerging predictive approaches extend beyond clinical scoring systems into computational hemodynamics. Patient-specific modeling of portal venous circulation allows simulation of postoperative flow conditions, enabling identification of regions predisposed to thrombosis before surgery occurs. Such techniques represent a potential paradigm shift, moving prevention from reactive monitoring toward personalized preoperative risk stratification. Although still investigational, these tools highlight the growing convergence of surgery, imaging, and computational medicine in perioperative risk assessment.


Diagnosis of splenoportal thrombosis relies primarily on imaging modalities. Contrast-enhanced computed tomography remains the most commonly used diagnostic tool, offering high sensitivity and anatomical detail, while Doppler ultrasonography provides a noninvasive alternative suitable for screening and follow-up. Given the frequency of asymptomatic presentation, routine imaging protocols have been advocated for high-risk patients, particularly during the early postoperative period when intervention is most effective.


Management strategies are generally successful when thrombosis is detected early. Anticoagulation alone leads to recanalization or complete resolution in most patients, with low recurrence rates reported during long-term follow-up. Severe complications are uncommon when treatment is initiated promptly, reinforcing the importance of vigilance rather than aggressive intervention. Surgical or interventional radiologic procedures are rarely required and are typically reserved for cases complicated by bowel ischemia or extensive thrombosis.


From a broader perspective, splenoportal thrombosis illustrates how removal of a single organ can disrupt systemic equilibrium. The spleen functions not only as an immunologic and hematologic organ but also as a regulator of portal circulation. Its absence transforms vascular dynamics, coagulation balance, and endothelial biology simultaneously. The postoperative state therefore represents a transitional physiological condition rather than a simple recovery phase, demanding targeted monitoring and adaptive management.


Future directions in the field increasingly emphasize personalization. Integration of platelet kinetics, spleen size, disease etiology, operative characteristics, and hemodynamic modeling may eventually permit individualized prophylaxis protocols tailored to each patient's risk profile. Advances in imaging analytics and machine learning may further refine prediction accuracy, allowing clinicians to intervene selectively while minimizing overtreatment.


In summary, splenoportal thrombosis after splenectomy is a common and clinically significant complication driven by complex interactions between hypercoagulability, altered venous flow, and patient-specific anatomical factors. Although often silent initially, it carries potential for severe morbidity if unrecognized. Contemporary evidence supports extended thromboprophylaxis in selected patients, early imaging surveillance, and prompt anticoagulation upon diagnosis. The evolution of predictive models and computational hemodynamic analysis signals a transition toward precision perioperative care. As understanding of postoperative portal physiology continues to expand, prevention and management strategies are likely to become increasingly individualized, improving outcomes while preserving the therapeutic benefits of splenectomy.


Ultimately, recognition of splenoportal thrombosis as an expected physiological risk rather than an unpredictable complication represents the most important conceptual advance. Through systematic risk assessment, vigilant monitoring, and tailored prophylaxis, the complication can be anticipated, detected early, and treated effectively, transforming a once underappreciated hazard into a manageable aspect of modern splenic surgery.


References:
1- Rottenstreich A, Kleinstern G, Spectre G, Da'as N, Ziv E, Kalish Y: Thromboembolic Events Following Splenectomy: Risk Factors, Prevention, Management and Outcomes. World J Surg. 42(3):675-681, 2018
2- Szasz P, Ardestani A, Shoji BT, Brooks DC, Tavakkoli A: Predicting venous thrombosis in patients undergoing elective splenectomy. Surg Endosc. 34(5):2191-2196, 2020
3- Swinson B, Waters PS, Webber L, Nathanson L, Cavallucci DJ, O'Rourke N, Bryant RD: Portal vein thrombosis following elective laparoscopic splenectomy: incidence and analysis of risk factors. Surg Endosc. 36(5):3332-3339, 2022
4- Liao Z, Wang Z, Su C, Pei Y, Li W, Liu J: Long term prophylactic anticoagulation for portal vein thrombosis after splenectomy: A systematic review and meta-analysis. PLoS One. 18(8):e0290164, 2023
5- Wang T, Yong Y, Ge X, Wang J: A computational model-based study on the feasibility of predicting post-splenectomy thrombosis using hemodynamic metrics. Front Bioeng Biotechnol. 11:1276999, 2024
6- Huang L, Han Y, Li Y, Li J: Risk prediction models for portal vein thrombosis (PVT) in patients after splenectomy: A systematic review and meta-analysis. Eur J Surg Oncol. 51(10):110300, 2025
