Current Issue : Year : 2026 – Volume: 16 Issue: 2
Current Issue Articles
Original Research Article
EFFECT ON THE NEWBORN OF USING MONOPOLAR CAUTERY DURING ABDOMINAL ENTRY IN CAESAREAN SECTIONS
http://dx.doi.org/10.70034/ijmedph.2026.2.1
Anshika Kashyap, Deepa Mathur, Alpa Bhosale, Ninad Gharat
Background: Caesarean section is one of the most frequently performed obstetric procedures, and electrosurgical devices such as monopolar cautery are increasingly used during abdominal entry to improve hemostasis and reduce operative time. Objective: To assess the effect on the newborn of using monopolar cautery during abdominal entry in elective caesarean sections compared with the conventional scalpel technique. Materials and Methods: This prospective randomized case–control study was conducted at a tertiary-level medical college and hospital from January 2025 to December 2025, including 100 women undergoing elective lower segment caesarean section at term. Results: In this study of 100 patients (50 in each group), baseline characteristics were comparable between the cautery and no-cautery groups. Mean maternal age was 29.9 ± 3.8 vs 30.2 ± 4.2 years, gestational age 38.27 ± 0.69 vs 38.37 ± 0.72 weeks, and birth weight 2.95 ± 0.25 vs 2.93 ± 0.26 kg. Apgar scores improved similarly from 8.40 ± 0.25 vs 8.27 ± 0.26 at 1 minute to 9.33 ± 0.67 vs 9.40 ± 0.53 at 10 minutes. Cord pH (7.29 ± 0.07 vs 7.30 ± 0.05), lactate (2.6 ± 0.9 vs 2.4 ± 0.8 mmol/L), and glucose (76.8 ± 9.7 vs 78.3 ± 10.2 mg/dL) were comparable. Previous LSCS was the most common indication (64% vs 62%), and NICU admissions were low (6% vs 8%). Conclusion: The use of monopolar cautery during abdominal entry in elective caesarean sections does not adversely affect immediate neonatal outcomes. Monopolar cautery appears to be a safe and effective technique that can be used without compromising newborn well-being. Keywords: Caesarean section, monopolar cautery, electrocautery, neonatal outcomes, Apgar score, cord blood pH.
Page No: 1-6 | Full Text
Original Research Article
A STUDY OF OUTCOME OF LONG BONE FRACTURES IN PAEDIATRIC PATIENTS MANAGED WITH TITANIUM ELASTIC NAILING SYSTEM IN TERTIARY CARE HOSPITAL
http://dx.doi.org/10.70034/ijmedph.2026.2.2
Suyog Patole, Sujay Mahadik, Mrigank Goel, Vedant Kadu, Satish Mehta
Background: The Titanium Elastic Nailing System (TENS) has become a preferred minimally invasive method for stabilizing paediatric diaphyseal long bone fractures, offering balanced flexible fixation that preserves the periosteal blood supply and allows early mobilization. This study assessed short-term functional and radiological outcomes, pain trajectory, malalignment, limb length discrepancy (LLD), and complications following TENS fixation in a tertiary care setting. Materials and Methods: A prospective cohort study was conducted over 12 months in the department of orthopaedics of a tertiary care hospital. The institutional ethics committee approved the study, and written informed consent was obtained from the guardians of the cases. Children with simple, minimally displaced diaphyseal long bone fractures treated with TENS were included on the basis of predefined inclusion and exclusion criteria. Convenience sampling yielded 25 patients (sample size based on a pilot proportion). Baseline demographics, injury mechanism, fracture pattern, and final diagnosis were documented. Follow-up evaluations were performed postoperatively, at 1 month, and at 3 months. Functional outcomes were graded using the Flynn et al. TENS scoring system. Radiographs were assessed for alignment and union. Complications, pain, malalignment grades, and LLD were documented. SPSS 29.0 was used for statistical analysis, and a p value less than 0.05 was considered statistically significant. Results: Mean age was 8.96 ± 2.91 years; 68% were 5–10 years old and 60% were male. Self-fall was the commonest mechanism (80%). Transverse fractures predominated (80%); femoral shaft fractures formed the largest diagnostic group. Malalignment <5° was maintained in 76% postoperatively and 80% at 3 months, with no significant change over follow-up (p=0.725). Pain decreased significantly to 0% at 3 months (p<0.0001). Minor complications occurred in 20% at 1 and 3 months (p=0.0139).
Radiologically, 96% achieved union by 3 months. Malunion was reported in 1 (4%) patient. Mean LLD remained small (0.61–0.65 cm across intervals). Excellent functional outcomes were seen in 80% at 3 months. Conclusion: In selected paediatric diaphyseal long bone fractures, TENS provided reliable alignment control, rapid pain resolution, high union rates, minimal LLD, and predominantly excellent short-term functional outcomes with low minor complication rates. Keywords: Paediatric long bone fractures; Femur fractures; Intramedullary nailing; Treatment outcome.
Page No: 7-13 | Full Text
Original Research Article
APRI SCORE AND FIB-4 INDEX AS NON-INVASIVE PREDICTORS OF ESOPHAGEAL VARICES IN LIVER CIRRHOSIS
http://dx.doi.org/10.70034/ijmedph.2026.2.3
T. Sowjanya Lakshmi, Rahul Conjeevaram, Maganti Yamuna, Jyothi Conjeevaram, Padamtinti Anurag Rao
Background: Cirrhosis leads to portal hypertension and life-threatening complications such as esophageal variceal bleeding. While endoscopy is the gold standard for diagnosis, its invasive nature and cost necessitate reliable non-invasive predictors. Aim: To evaluate the Aspartate Aminotransferase to Platelet Ratio Index (APRI) and the Fibrosis-4 (FIB-4) index as non-invasive predictors of esophageal varices in liver cirrhosis. Materials and Methods: This observational study at Narayana Medical College Hospital (January–July 2024) included 105 cirrhotic patients. APRI and FIB-4 scores were calculated and compared against endoscopic findings. Results: Esophageal varices were present in 65.7% of patients. An APRI cut-off of ≥ 0.9 demonstrated 74.5% sensitivity, 61.1% specificity, and a significant association with high-risk varices (p = 0.0005). Binary logistic regression analysis revealed that patients with an APRI score of ≥ 0.9 had nearly five times the odds of harbouring high-risk varices compared to patients with an APRI score < 0.9 (OR = 4.59; 95% CI: 2.00–10.58; p = 0.0003). A FIB-4 cut-off of ≥ 2.78 showed higher sensitivity (82.3%) but lower specificity (40.7%), with a significant association (p = 0.0174). Patients with a FIB-4 score ≥ 2.78 had more than five times the odds of having esophageal varices compared to patients with a score below the 2.78 threshold (OR = 5.31; 95% CI: 2.15–13.10; p < 0.001). Conclusion: Both indices are valuable tools for risk stratification. APRI serves as a reliable positive predictor for identifying high-risk varices, while FIB-4's high sensitivity makes it an effective primary screening tool to rule out severe disease in resource-limited settings. Keywords: APRI, FIB-4, Esophageal varices, Non-invasive predictor, Cirrhosis.
Page No: 14-20 | Full Text
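For readers unfamiliar with the two indices evaluated above, both are computed from routine laboratory values. The following sketch uses the standard published formulas, not code from the study itself; the AST upper limit of normal (assumed here to be 40 IU/L in the example) and the variable names are illustrative assumptions.

```python
import math

def apri(ast: float, ast_uln: float, platelets: float) -> float:
    """APRI = (AST / AST upper limit of normal) / platelet count (10^9/L) x 100."""
    return (ast / ast_uln) / platelets * 100

def fib4(age: float, ast: float, alt: float, platelets: float) -> float:
    """FIB-4 = (age x AST) / (platelet count (10^9/L) x sqrt(ALT))."""
    return (age * ast) / (platelets * math.sqrt(alt))

# Illustrative values: AST 80 IU/L (ULN 40), ALT 40 IU/L,
# platelets 100 x 10^9/L, age 60 years.
print(apri(80, 40, 100))      # 2.0 -> above the study's >= 0.9 cut-off
print(fib4(60, 80, 40, 100))  # ~7.59 -> above the study's >= 2.78 cut-off
```

Both example values fall above the study's cut-offs, i.e. into the groups the authors associate with higher odds of varices.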
Original Research Article
PREDICTORS OF WEIGHT CHANGE DURING METFORMIN THERAPY IN ADULTS WITH TYPE 2 DIABETES: AN OBSERVATIONAL COHORT STUDY
http://dx.doi.org/10.70034/ijmedph.2026.2.4
Ambili Remesh, Mohammed Jaseem Ibrahim
Background: Metformin is the most prescribed first-line medication for people with type 2 diabetes mellitus (T2DM). In addition to its blood sugar-lowering effect, metformin is known to be associated with weight stabilisation or even slight weight reduction. However, the amount and type of weight changes are highly variable with metformin. This study aims to determine the weight-reducing potential of the antihyperglycemic agent. Understanding other factors causing weight changes could help improve treatment plans and metabolic outcomes. Materials and Methods: This prospective observational cohort study was conducted on an outpatient basis. We used consecutive sampling during routine outpatient visits to enrol 300 adult patients with type 2 diabetes mellitus who had been on metformin for at least six months. We gathered demographic, clinical, and metabolic information of the participants at baseline. For three months, we conducted follow-ups and recorded their body weight. Data were analysed using paired t-tests and regression modelling to assess weight changes and identify variables that may contribute to weight loss. Results: The participants had an average age of 52.4 years, with a standard deviation of 9.3 years. The group was made up of 54% men and 46% women. At the beginning of the study, the average body mass index (BMI) was 28.3 kg/m², with a standard deviation of 4.2 kg/m². During the follow-up period, 42% of patients lost weight, 36% maintained their weight, and 22% gained weight. A higher baseline BMI and a shorter duration of diabetes were strongly correlated with weight loss during metformin therapy. Age showed a substantial correlation with weight change, whereas gender did not. Conclusion: Metformin therapy is linked to moderate weight loss in a significant number of individuals with type 2 diabetes. Baseline BMI and diabetes duration appear to be major predictors of weight change.
This shows how vital it is to analyse each patient individually when managing diabetes. With appropriate lifestyle changes, we can postulate that the treatment results can be further optimised. Keywords: Metformin, Type 2 Diabetes Mellitus, Weight Change, Body Mass Index, Predictors, Glycemic Control.
Page No: 21-26 | Full Text
Original Research Article
POST-OPERATIVE EPIDURAL ANALGESIA AFTER TKR: A DOUBLE-BLIND RANDOMIZED COMPARISON OF ROPIVACAINE 0.2% AND LEVOBUPIVACAINE 0.125%
http://dx.doi.org/10.70034/ijmedph.2026.2.5
Harshwardhan Tikle, Megha Harshwardhan Tikle, Kadali Lakshmi Sudha, Parasmita Bhattacharjee
Background: Total knee replacement (TKR) is associated with significant postoperative pain, which can delay rehabilitation and prolong hospital stay. Epidural analgesia using long-acting local anesthetics like levobupivacaine and ropivacaine is widely used for effective pain control. The aim is to compare the analgesic efficacy of epidural levobupivacaine (0.125%) and ropivacaine (0.2%) in patients undergoing TKR. Materials and Methods: This prospective, double-blind randomized study included 110 patients undergoing TKR, divided into two groups: Group L (levobupivacaine 0.125%, n=55) and Group R (ropivacaine 0.2%, n=55). Postoperative epidural infusion was administered, and patients were evaluated using Visual Analogue Scale (VAS) scores at rest and on movement at different time intervals. Hemodynamic parameters, sensory regression, motor blockade (Modified Bromage Scale), and adverse effects were also recorded. Results: Demographic parameters were comparable between groups. VAS scores at rest and during movement were similar at all time intervals (p>0.05). Group R demonstrated significantly lower pulse rate and systolic blood pressure at multiple time points (p<0.05), indicating better hemodynamic stability. Sensory regression to L1 was significantly prolonged in Group R (28.53 ± 13.463 hrs) compared to Group L (22.78 ± 11.263 hrs; p=0.018). Motor blockade and incidence of side effects were comparable in both groups. Conclusion: Both levobupivacaine and ropivacaine provide effective postoperative epidural analgesia following TKR. However, ropivacaine offers better hemodynamic stability and longer sensory blockade, making it a preferable choice. Keywords: Total knee replacement, Epidural analgesia, Ropivacaine, Levobupivacaine, Visual Analogue Scale, Postoperative pain.
Page No: 27-32 | Full Text
Original Research Article
PREOPERATIVE NEUTROPHIL–LYMPHOCYTE RATIO AS A PREDICTOR OF OPERATIVE DIFFICULTY AND POSTOPERATIVE OUTCOMES IN OPEN CHOLECYSTECTOMY: A RETROSPECTIVE STUDY
http://dx.doi.org/10.70034/ijmedph.2026.2.6
Manu Singh, Bheeni Bharti, Rahul Dhariwal
Background: Gallstone disease is a common gastrointestinal condition affecting a large proportion of adults worldwide. Although laparoscopic cholecystectomy is widely accepted as the standard treatment, open cholecystectomy remains essential in complicated cases. Identifying predictors of surgical difficulty and postoperative complications is therefore clinically important. The objective is to evaluate the association between the preoperative neutrophil–lymphocyte ratio (NLR) and surgical outcomes in patients undergoing open cholecystectomy. Materials and Methods: A retrospective observational study was conducted in the Department of Anesthesia, Kalyan Singh Government Medical College, Bulandshahr, Uttar Pradesh, India, from August 2024 to August 2025. Eighty patients undergoing open cholecystectomy were included. Patients were divided into two groups based on preoperative NLR values: NLR <3 and NLR ≥3. Operative time, difficult dissection, postoperative complications, and duration of hospital stay were recorded. Results: Patients with elevated NLR (≥3) had a longer operative time (80 vs 55 minutes), a higher incidence of difficult dissection (35% vs 10%), higher surgical site infection rates (18% vs 4%), and a prolonged hospital stay (6 vs 3 days). Conclusion: Preoperative NLR is a simple and inexpensive biomarker that may help predict operative difficulty and postoperative complications in open cholecystectomy. Keywords: Neutrophil-lymphocyte ratio, gallstone disease, open cholecystectomy, inflammation, surgical outcomes.
Page No: 33-35 | Full Text
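The biomarker above is a simple ratio from a differential blood count. A minimal illustrative sketch (the function names are ours; the ≥ 3 cut-off is the one used in the study):

```python
def nlr(neutrophils: float, lymphocytes: float) -> float:
    """Neutrophil-lymphocyte ratio from absolute counts (same units)."""
    return neutrophils / lymphocytes

def risk_group(value: float, cutoff: float = 3.0) -> str:
    """Stratify into the study's two groups: NLR < 3 vs NLR >= 3."""
    return "NLR >= 3" if value >= cutoff else "NLR < 3"

print(risk_group(nlr(6.0, 1.5)))  # 6.0 / 1.5 = 4.0 -> "NLR >= 3"
print(risk_group(nlr(3.0, 2.0)))  # 1.5 -> "NLR < 3"
```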
Original Research Article
AGE-RELATED VARIATIONS IN LIP PRINT PATTERNS: A CHEILOSCOPIC STUDY IN A SUBURBAN POPULATION OF DAKSHINA KANNADA DISTRICT
http://dx.doi.org/10.70034/ijmedph.2026.2.7
Fouzia Hameed, Vina Vaswani, Viswakanth B, Shahnavaz Manipady, Zahida Niaz, Aysha Hadia
Background: Cheiloscopy, the study of lip print patterns, has emerged as a useful adjunct in forensic identification due to the unique and relatively stable nature of lip groove patterns. Although several studies have evaluated the distribution of lip print patterns, limited research has focused on age-related variations in these patterns. The present study aimed to evaluate the distribution of lip print patterns across different age groups in a suburban population of Dakshina Kannada district. Materials and Methods: A cross-sectional study was conducted on 300 individuals aged 12–60 years residing in a suburban region of Dakshina Kannada district. The study population was categorized into three age groups: 12–20 years, 21–40 years, and 41–60 years. Lip prints were recorded using the cellophane tape method after applying lipstick and were analyzed using the Suzuki and Tsuchihashi classification. Data were analyzed using SPSS software, and the Chi-square test was used to assess the association between age groups and lip print patterns. Results: Among the five lip print patterns identified, Type I pattern was the most predominant (32.0%), followed by Type V (20.7%), Type IV (16.3%), Type II (15.7%), and Type III (15.3%). Age-wise analysis showed that Type I pattern was more common in the younger age group (44.8%), while Type III and Type V patterns were relatively more frequent in the older age group. However, statistical analysis showed no significant association between age group and lip print pattern distribution (χ² = 12.84, p = 0.118). Conclusion: The findings of this study indicate that lip print patterns remain largely stable across different age groups, supporting their potential use as a supplementary tool for personal identification in forensic investigations. Keywords: Cheiloscopy; Lip prints; Age-related variation; Suzuki and Tsuchihashi classification; Dakshina Kannada.
Page No: 36-41 | Full Text
Original Research Article
FORENSIC UTILITY OF CHEILOSCOPY IN PERSONAL IDENTIFICATION: A POPULATION-BASED STUDY FROM DAKSHINA KANNADA DISTRICT
http://dx.doi.org/10.70034/ijmedph.2026.2.8
Fouzia Hameed, Vina Vaswani, Viswakanth B, Shahnavaz Manipady, Zahida Niaz, Aysha Hadia
Background: Cheiloscopy, the study of lip print patterns, has been increasingly explored as an adjunct method for personal identification in forensic investigations. The characteristic grooves and fissures present on the vermilion border of the lips form distinct morphological patterns that may assist in identifying individuals. The present study aimed to evaluate the distribution of lip print patterns and assess the forensic utility of cheiloscopy in a population from Dakshina Kannada district. Materials and Methods: A population-based cross-sectional study was conducted on 300 individuals aged 12–60 years residing in suburban areas of Dakshina Kannada district. Lip prints were recorded using the cellophane tape method after applying lipstick and were analyzed according to the Suzuki and Tsuchihashi classification. Descriptive statistics were used to determine the frequency distribution of lip print patterns, and the Chi-square test was applied to assess the association between gender and lip print pattern distribution. Results: Among the five lip print patterns identified, Type I pattern was the most predominant (32.0%), followed by Type V (20.7%), Type IV (16.3%), Type II (15.7%), and Type III (15.3%). Gender-wise comparison showed that Type I pattern was the most common pattern in both males and females. However, statistical analysis revealed no significant association between gender and lip print pattern distribution (χ² = 5.236, p = 0.264). A majority of lip prints (79.3%) exhibited identifiable patterns (Type I–IV), indicating their potential usefulness for forensic identification. Conclusion: The findings of this study demonstrate that a large proportion of lip prints possess identifiable morphological patterns, supporting the use of cheiloscopy as a supplementary method for personal identification in forensic investigations. Keywords: Cheiloscopy; Forensic identification; Personal identification; Suzuki and Tsuchihashi classification; Dakshina Kannada.
Page No: 42-47 | Full Text
Original Research Article
A PROSPECTIVE OBSERVATIONAL STUDY OF PEDIATRIC PROXIMAL PHALANX FRACTURES: EPIDEMIOLOGY, CLINICAL FEATURES, MANAGEMENT, AND OUTCOMES
http://dx.doi.org/10.70034/ijmedph.2026.2.9
Kondamudi Srinivasu MBBS, DNB, MCh, Palli Shirin MBBS, MS, MCh, Dr NB, Kommu Vijay Babu MBBS, MS, MCh
Background: Proximal phalanx fractures are the most frequent hand fractures in children. Although most injuries are treated nonoperatively, outcomes vary according to fracture pattern, anatomical location, degree of displacement, and the presence of associated soft tissue injury. The objective is to evaluate the clinical profile, management patterns, and functional and radiological outcomes of proximal phalanx fractures in children aged 12 years or younger. Materials and Methods: This prospective study was conducted from January 2024 to January 2025 and included 24 children with proximal phalanx fractures. Data were collected on age, sex, mechanism of injury, fracture location and type, associated injuries, treatment modality, and follow-up outcomes. Pain was assessed using the Visual Analog Scale (VAS), functional outcome was evaluated by range of motion, and residual deformity was assessed on follow-up radiographs. Results: The study included 24 children, comprising 16 males and 8 females, with a mean age of 9 years. Sports-related trauma was the most common mechanism of injury, accounting for 58.3% of cases. The base of the proximal phalanx was the most frequently affected site (50.0%). Conservative management with a volar slab was successful in 14 patients, whereas open reduction was required in 1 patient. At a mean follow-up of 9 months, 91.7% of patients were pain-free. Residual deformity was noted in 2 patients. Stiffness and restricted range of motion were primarily observed in children with associated soft tissue injury or extensor tendon involvement. Conclusion: Proximal phalanx fractures in children generally have favorable outcomes with conservative treatment. However, displaced fractures and injuries associated with soft tissue damage may require surgical intervention. Adherence to physiotherapy appears to play an important role in optimizing functional recovery.
Keywords: children; proximal phalanx fracture; pediatric hand fractures; conservative management; functional outcome; range of motion; soft tissue injury.
Page No: 48-53 | Full Text
Case Report
DERMATOFIBROSARCOMA PROTUBERANS OF THE ANTERIOR CHEST WALL: A CASE REPORT
http://dx.doi.org/10.70034/ijmedph.2026.16.2.10
Divya, Dinesh Mahalingam, Lakshmi Narayanan Sankar, Chitra Tulasiram
Dermatofibrosarcoma protuberans (DFSP) is a rare, low-to-intermediate grade fibroblastic neoplasm of dermal origin characterized by locally aggressive behavior and a high propensity for recurrence if inadequately excised. It accounts for less than 0.1% of all malignancies and approximately 1% of soft tissue sarcomas. Although metastasis is rare (<5%), the infiltrative growth pattern with tentacle-like projections into surrounding tissue necessitates meticulous surgical management. We report the case of a 36-year-old male who presented with a slowly progressive plaque over the left anterior chest wall for five years. The lesion was initially misdiagnosed as tinea incognito. Clinical examination revealed a well-defined infiltrative plaque measuring 10 × 8 cm with nodular components, confined to the dermis and subcutaneous tissue. Punch biopsy demonstrated spindle cell proliferation arranged in a storiform pattern. Immunohistochemistry showed strong CD34 positivity and S-100 negativity, confirming DFSP. Contrast-enhanced CT scan revealed tumor confinement to superficial planes without deep muscle or bony invasion. The patient underwent wide local excision (WLE) with adequate margins followed by split skin graft reconstruction. Histopathology confirmed tumor-free margins with a low Ki-67 index (6%). Postoperative recovery was uneventful, with excellent graft uptake and no recurrence to date. This case emphasizes the importance of early recognition, histopathological confirmation, immunohistochemistry, and adequate surgical margins in achieving optimal outcomes. Long-term surveillance remains essential due to the risk of late recurrence.
Page No: 54-57 | Full Text
Original Research Article
A PROSPECTIVE OBSERVATIONAL STUDY ON MATERNAL AND PERINATAL OUTCOMES IN SINGLETON PREGNANCIES WITH ABNORMAL PRESENTATIONS IN TERTIARY CARE HOSPITAL
http://dx.doi.org/10.70034/ijmedph.2026.2.11
A Viplava, Swathi Rallabhandi, Amritha Aurora Meduri, Paka Sushritha
Aim: To study the maternal and perinatal outcomes in singleton pregnancies with abnormal presentations. Materials and Methods: The study was designed as a single-center prospective observational investigation at the Government Maternity Hospital, Sultan Bazar, focusing on maternal and perinatal outcomes in singleton pregnancies with abnormal presentations. Spanning 24 months, the research included 150 antenatal women from 37 weeks' gestation onward, confirmed by clinical and ultrasound examination. Cases involving intrauterine death, congenital anomalies, and preterm births were excluded, and consecutive sampling of eligible participants was used. Data were collected using a semi-structured questionnaire, with thorough recording of demographic and obstetric details, including gravidity, parity, and complications from previous pregnancies. Results: In the present study, the mode of delivery was predominantly Lower Segment Caesarean Section (LSCS), which was necessary in 60% of the cases, reflecting the challenging nature of these pregnancies. Only 30% of the deliveries were normal vaginal births, indicating a tendency towards surgical intervention in complicated cases. Notably, age significantly influenced the mode of delivery: those under 20 years had a higher proportion of vaginal deliveries (55.6%), with LSCS least common (22.2%), a pattern that inversely correlated with increasing age, as LSCS dominated (84.3%) in those aged 30 years and above (P < 0.001). The AFI results highlighted varying preferences for delivery mode based on fluid levels, with lower AFI favoring more balanced delivery methods and higher AFI leading predominantly to LSCS (92.2%) (P < 0.001). However, the type of labor, whether spontaneous or induced, showed no significant difference in delivery mode selection (P = 0.657), suggesting that other factors may have a more substantial impact on the decision-making process for the mode of delivery.
Conclusion: The outcomes of this study reveal critical areas for potential improvement in perinatal care, particularly in the management of pregnancies complicated by abnormal presentations. The relatively high rates of NICU admissions illustrate the gravity of these cases and the imperative for enhanced prenatal monitoring and intervention strategies. The findings advocate for the development of refined guidelines that can assist healthcare providers in making informed decisions about the most suitable delivery methods, aiming to minimize complications and improve survival rates. Overall, this research contributes valuable insights into the impacts of delivery modes on maternal and perinatal health, serving as a basis for future studies and policy-making in obstetric care. Keywords: LSCS, Perinatal care, NICU, AFI, Vaginal births.
Page No: 58-67 | Full Text
Original Research Article
A STUDY ON SERUM ADENOSINE DEAMINASE AS AN INDICATOR OF GLYCAEMIC STATUS IN TYPE 2 DIABETES MELLITUS
http://dx.doi.org/10.70034/ijmedph.2026.2.12
Thammadagoni Alivelu, Deva Mona, Earjala Raju, Shankar Chilumula
Background: Type 2 diabetes mellitus is a common metabolic disorder characterized by high blood glucose levels. Adenosine deaminase (ADA) is an enzyme that catalyses the irreversible deamination of adenosine to inosine. Adenosine is known to exert potent metabolic effects acting through its receptors. Increased levels of serum ADA have been shown in individuals with type 2 diabetes mellitus. The study aims to understand the relationship between serum ADA and blood glucose levels. The objectives were to estimate serum adenosine deaminase (ADA) levels in patients with uncomplicated type 2 diabetes mellitus and in a non-diabetic control group, and to estimate the association between serum ADA values among cases and controls based on blood sugars. Materials and Methods: The study was conducted at Gandhi Medical College, Secunderabad. It involved 50 cases of uncomplicated type 2 diabetes mellitus and 50 age- and sex-matched healthy controls. Blood and urine samples were collected after obtaining written informed consent from cases and controls. Serum adenosine deaminase levels were estimated colourimetrically, based on the method described by Giusti and Galanti. Serum FBS, PPBS, HbA1c levels, blood urea, serum creatinine, lipid profile, and urinary sugar and proteins were also measured simultaneously using routine laboratory methods. Results: A clear-cut elevation of serum ADA was found in diabetic subjects as compared to controls. In addition, a very large positive correlation was found between serum ADA and blood glucose levels. Conclusion: This study shows that the ADA level is significantly related to glycemic status in type 2 diabetes mellitus. Altered serum ADA level is an indirect expression of tissue adenosine levels. Adenosine, through its multiple metabolic effects, is involved in the pathophysiology of diabetes. The present study highlights the importance of this metabolite and the need for further studies.
Keywords: Type 2 diabetes mellitus; Adenosine Deaminase; Adenosine.
Page No: 68-71 | Full Text
Systematic Review
AWARENESS AND PRACTICES RELATED TO SMOKING AND ALCOHOL CESSATION PRIOR TO SURGERY: A SYSTEMATIC REVIEW
http://dx.doi.org/10.70034/ijmedph.2026.2.13
Divya Ravikumar, Fikret Ahmadov
Background: Smoking and excessive alcohol consumption are well-established risk factors for adverse postoperative outcomes, including wound complications, pulmonary infections, prolonged hospital stay, and increased morbidity and mortality. Evidence suggests that preoperative cessation of tobacco and alcohol significantly reduces surgical complications and improves recovery. Despite established clinical guidelines recommending cessation at least 4–8 weeks prior to elective surgery, patient awareness, adherence, and implementation of cessation practices remain inconsistent across healthcare settings. However, current evidence on awareness and real-world practices is fragmented. A comprehensive systematic review is therefore warranted. This systematic review aims to assess the level of awareness and the practices related to smoking and alcohol cessation prior to surgery among patients undergoing surgical procedures. Materials and Methods: This systematic review followed PRISMA guidelines and included English-language observational studies, randomized controlled trials, systematic reviews, meta-analyses, and guideline studies focusing on awareness and practices related to smoking and alcohol cessation prior to surgery and on postoperative outcomes. Studies were included according to predefined inclusion criteria. Databases including PubMed/MEDLINE, Scopus, and Web of Science were searched using relevant keywords. Study selection and data extraction were performed, the risk of bias was assessed using appropriate design-specific tools, and findings were synthesized narratively. Results: Preoperative smoking and alcohol cessation reduces postoperative complications. Smoking cessation for at least 4 weeks before surgery significantly lowers pulmonary, wound, and overall complications, with greater benefits seen with longer cessation. Intensive cessation programs combining behavioral support and pharmacotherapy improve abstinence rates and may reduce postoperative morbidity.
Combined smoking and alcohol use is associated with the highest risk of postoperative complications, readmission, and reoperation. Conclusion: Preoperative smoking and alcohol cessation interventions, particularly those lasting 4–8 weeks and involving behavioral and pharmacological support, improve abstinence rates and reduce postoperative complications, with greater benefits seen with longer cessation. However, further large-scale studies with standardized methods and long-term follow-up are needed to determine optimal intervention timing and long-term outcomes. Keywords: Smoking cessation, Alcohol cessation, Awareness and practices, Surgery, Preoperative care, Postoperative outcomes.
Page No: 72-77 | Full Text
Systematic Review
LASER AND LIGHT-BASED THERAPIES FOR MELASMA: A SYSTEMATIC REVIEW
http://dx.doi.org/10.70034/ijmedph.2026.2.14
Alicia Kivork, Grigor Haryutunyan
Background: Melasma is a chronic facial hyperpigmentation disorder commonly affecting women of reproductive age, driven by ultraviolet and visible light exposure, hormonal factors, and dermal changes, with frequent recurrence. Although hydroquinone-based triple combination therapy remains first-line treatment, lasers and intense pulsed light (IPL) are increasingly used. This systematic review evaluates the efficacy and safety of laser and light-based therapies for melasma. The objective is to determine the most effective and safest laser modalities, compare outcomes across technologies and protocols, and identify research gaps to inform future studies and guide evidence-based clinical decision-making in melasma management. Materials and Methods: This systematic review was conducted according to PRISMA guidelines. PubMed, Scopus, Web of Science, and Google Scholar were searched using predefined keywords related to melasma and laser/light-based therapies. Randomized controlled trials, observational studies, review articles, and meta-analyses were included. Two reviewers independently screened studies and extracted data. Owing to heterogeneity, results were synthesized qualitatively. Results: Laser and light-based therapies demonstrated variable efficacy in melasma management. Q-switched Neodymium-doped Yttrium Aluminium Garnet (Nd:YAG) lasers showed temporary improvement but were associated with recurrence and pigmentary adverse effects. Fractional lasers and IPL provided moderate benefit, with combination IPL therapies enhancing outcomes. Picosecond lasers, particularly with diffractive lens arrays, showed superior efficacy, better patient satisfaction, and favorable safety profiles. Combination approaches improved long-term outcomes but increased the risk of mild adverse events. Conclusion: Picosecond lasers and combination-based strategies appear to offer the most promising balance of efficacy and safety.
However, recurrence and procedure-related pigmentary changes remain concerns, highlighting the need for standardized protocols and larger controlled trials to optimize long-term melasma management. Keywords: Melasma, Laser therapy, Picosecond laser, Q-switched Nd:YAG, Fractional laser, Ablative laser, Non-ablative laser, Intense pulsed light (IPL), Combination therapy.
Page No: 78-83 | Full Text
Original Research Article
RECENT CHANGES IN THE CLINICAL SYMPTOMATOLOGY OF COMMUNITY-ACQUIRED PNEUMONIA IN CHILDREN
http://dx.doi.org/10.70034/ijmedph.2026.2.15
Kabita Ghose, Papiya Mistry, Faiza Ali
View Abstract
Background: Community-acquired pneumonia (CAP) remains one of the most prevalent reasons for hospital admission in children. The clinical manifestation of pediatric community-acquired pneumonia appears to be changing owing to widespread immunization, improved nutrition, and shifting respiratory pathogen patterns. Classical symptoms such as high fever and chest indrawing may be less common now; instead, more cases are being recorded with wheezing and viral-like symptoms. The objective is to assess the present clinical symptomatology, severity indicators, and outcomes of community-acquired pneumonia in children admitted during a recent one-year timeframe. Materials and Methods: This retrospective observational study encompassed 100 children aged 1 month to 12 years, admitted with CAP from February 2025 to January 2026 at a tertiary care hospital. Demographic information, symptoms, clinical indicators, oxygen saturation, laboratory results, radiographic findings, treatment, and outcomes were analyzed. Results: Fever (93%) and cough (92%) were the most common presenting symptoms. Tachypnea was present in 76% of cases, although only 24% had chest indrawing. Wheezing was observed in 27% of children, suggesting an increasing wheeze-associated pneumonia phenotype. Hypoxia (SpO₂ <92%) occurred in 29% of cases. Complicated pneumonia was seen in 12% of patients, and 9% required intensive care. Viral or atypical causes were suspected in over half of the cases based on clinical characteristics and laboratory results. Conclusion: The clinical manifestation of pediatric CAP is evolving towards diminished classical bacterial characteristics and an increased prevalence of wheeze-dominant and viral-like symptoms. These findings underscore the necessity to modify diagnostic and therapeutic strategies to align with current disease trends, thereby preventing overtreatment while facilitating the prompt recognition of severe cases.
Keywords: Community-acquired pneumonia; pediatric population; clinical manifestations; wheezing; hypoxia; pediatric respiratory diseases.
Page No: 84-87 | Full Text
Original Research Article
COMPARISON OF SHORT-TERM CLINICAL OUTCOMES OF FIXED LOOP VERSUS ADJUSTABLE LOOP AS FEMORAL CORTICAL SUSPENSION DEVICES IN ARTHROSCOPIC ANTERIOR CRUCIATE LIGAMENT RECONSTRUCTION
http://dx.doi.org/10.70034/ijmedph.2026.2.16
Amit Garud, Nishant Gaonkar, Vaibhav Koli
View Abstract
Background: Various methods of femoral graft fixation are available, among which cortical loop fixation devices are most commonly used. This study aims to compare fixed loop and adjustable loop devices in terms of clinical outcomes. Materials and Methods: In this prospective randomized study, patients were divided into two groups based on the type of femoral fixation device used. Clinical outcomes were assessed at 6 months and 1 year using the International Knee Documentation Committee (IKDC) and Lysholm scores. Results: Both groups were comparable in terms of demographic characteristics and preoperative scores. Postoperatively, no statistically significant difference was observed between the two groups at 6 months or at 1 year. Conclusion: Arthroscopic ACL reconstruction yields comparable short-term clinical outcomes when either fixed loop or adjustable loop devices are used for femoral cortical fixation. Keywords: Fixed loop device, Adjustable loop device, Arthroscopic anterior cruciate ligament reconstruction.
Page No: 88-91 | Full Text
Original Research Article
A RETROSPECTIVE STUDY OF CONSERVATIVE MANAGEMENT IN SOLID ORGAN INJURY IN BLUNT ABDOMINAL TRAUMA WITH HEMOPERITONEUM AT A TERTIARY CARE CENTRE IN SOUTH GUJARAT
http://dx.doi.org/10.70034/ijmedph.2026.2.17
Bhavna P. Kala, Pavankumar M. Khunt, Aishwarya S. Champaneria
View Abstract
Background: Blunt abdominal trauma frequently results in solid organ injury with hemoperitoneum. In hemodynamically stable patients, non-operative management has increasingly become the preferred approach. This study aimed to evaluate the outcomes of conservative management and assess the utility of the Clinical Abdominal Scoring System (CASS) in such cases. Materials and Methods: A retrospective observational study was conducted at a tertiary care centre in South Gujarat from 2024 to 2026. A total of 50 hemodynamically stable patients with radiologically confirmed solid organ injury and hemoperitoneum were included. Data regarding demographics, mode of injury, clinical presentation, vital parameters, organ involvement, and CASS were collected from medical records and analyzed using descriptive statistics. Results: Among the 50 patients, 86% were males. The most affected age group was 18–25 years (36%). Road traffic accidents were the leading cause (60%). Most patients (68%) presented within 2 hours of injury. Abdominal pain was present in all cases, while vomiting and loss of consciousness were noted in 18% and 12%, respectively. The majority had pulse rate <90/min (70%) and systolic blood pressure between 90–120 mmHg (80%). Liver was the most commonly injured organ (62%), followed by spleen (26%), kidney (8%), and combined injuries (4%). Most patients had CASS scores between 8–12 (56%), while 44% had scores <8. All patients were managed conservatively with no requirement for operative intervention. Conclusion: Conservative management is highly effective in hemodynamically stable patients with solid organ injury. CASS is a valuable tool for clinical assessment and management planning. Keywords: Blunt abdominal trauma, hemoperitoneum, conservative management, solid organ injury, clinical abdominal scoring system.
Page No: 92-96 | Full Text
Original Research Article
RESISTANCE AND VIRULENCE IN CA-MRSA: CLINICAL AND MICROBIOLOGICAL SPECTRUM OF COMMUNITY-ACQUIRED SSTIs
http://dx.doi.org/10.70034/ijmedph.2026.2.18
Khushbu Sakure, Umesh Hassani, Vilas Thombare, Sunil Deshmukh, Ravindra Kadse
View Abstract
Background: Skin and soft tissue infections (SSTIs) are a major clinical burden, with community-acquired methicillin-resistant Staphylococcus aureus (CA-MRSA) increasingly implicated. These strains often combine resistance determinants with virulence factors such as Panton-Valentine leukocidin (PVL), complicating management. The aim is to investigate the clinical profile, microbiological spectrum, genotypic characteristics, and antibiotic resistance patterns of CA-SSTIs, with emphasis on CA-MRSA. Materials and Methods: A prospective cross-sectional study was conducted on 350 clinically diagnosed CA-SSTI cases over two years. Samples were processed for culture, phenotypic identification, antibiotic susceptibility testing, and PCR-based genotypic analysis. Results: The mean patient age was 37.9 years, with a slight male predominance (56%). Abscesses (46%) and cellulitis (34%) were the most common presentations. Culture positivity was 61%, with S. aureus as the predominant isolate (43%). MRSA accounted for 16% of cases. Genotypic analysis revealed mecA positivity in 86% and PVL gene presence in 79% of MRSA isolates. Antibiogram showed high resistance to ciprofloxacin (59%), gentamicin (59%), and erythromycin (53%), moderate resistance to clindamycin (33%) and tetracycline (34%), while all isolates remained sensitive to linezolid and vancomycin. Conclusion: CA-MRSA strains in this cohort demonstrated both multidrug resistance and high PVL prevalence, underscoring their dual threat of resistance and virulence. Linezolid and vancomycin remain reliable therapeutic options, while clindamycin may retain partial utility. Continuous molecular surveillance and antibiotic stewardship are essential to guide rational therapy and curb the spread of resistant, virulent clones. Keywords: Community-acquired SSTIs, MRSA.
Page No: 97-101 | Full Text
Original Research Article
CHIKUNGUNYA VIRUS INFECTION IN GONDIA, A TRIBAL DISTRICT IN MAHARASHTRA, INDIA: A DEMOGRAPHICAL STUDY
http://dx.doi.org/10.70034/ijmedph.2026.2.19
Khushbu s Sakure, Vivek Patil, Ravindra Khadse, Sunil Deshmukh
View Abstract
Background: Chikungunya virus infection continues to pose a significant public health challenge in India, with recurrent outbreaks and marked seasonal variation. District-level laboratory surveillance is essential to understand local transmission dynamics and to guide targeted vector control measures. The objective is to describe the demographic, temporal and spatial distribution of laboratory-confirmed chikungunya cases in Gondia district, Maharashtra, during 2024–2025 and to compare the month-wise trend with dengue during the same period. Materials and Methods: This study was conducted at the Department of Microbiology, Government Medical College, Gondia, which serves as a sentinel centre for vector-borne disease surveillance. Serum samples received from clinically suspected chikungunya cases during January 2024 to December 2025 were tested for chikungunya IgM antibodies by IgM-capture ELISA. Data on age, sex, month of reporting and taluka of residence were analysed. Descriptive statistics were used to assess trends and distribution patterns. Results: A total of 6,507 samples were tested using IgM ELISA during the study period, of which 181 were laboratory confirmed for chikungunya virus infection, giving an overall positivity of 2.8%. The positivity rate was higher in 2024 as compared to 2025. Males constituted 52% of cases, while females accounted for 48%, indicating near equal sex distribution. The majority of infections were observed among adults, particularly in the 21–30-year age group, followed by the 31–40-year group. Month-wise analysis demonstrated distinct seasonal peaks, with a major increase during the monsoon and post-monsoon period and a smaller rise during the early months of the year. Spatial analysis revealed marked clustering of cases in Gondia taluka (98 cases, 54.1%), followed by Sadak Arjuni, Goregaon and Amgaon, while comparatively fewer cases were reported from peripheral talukas.
Conclusion: The present laboratory-based surveillance highlights clear spatio-temporal clustering and predominance of chikungunya infection among the adult population in Gondia district, with transmission peaks corresponding to favourable vector breeding seasons. Concentration of cases in specific talukas underscores the need for focused vector control and strengthened surveillance at sub-district level. Continuous laboratory surveillance remains crucial for early detection of transmission trends and for guiding district-specific public health interventions. Keywords: Chikungunya, surveillance, seasonality, Maharashtra, dengue.
Page No: 102-106 | Full Text
Original Research Article
A STUDY ON INCIDENCE OF POST MASTECTOMY PAIN AND PHANTOM BREAST SYNDROME FOLLOWING MASTECTOMY
http://dx.doi.org/10.70034/ijmedph.2026.2.20
David Salivendra, Ponnuru Chandi Priya, Jyeshavath Venkateswara Naik, Blessy Rani Medikonda
View Abstract
Background: Mastectomy is a commonly performed surgical procedure for the treatment of breast cancer. Although it is effective in controlling the disease, many patients experience postoperative complications such as post-mastectomy pain syndrome and phantom breast syndrome. These conditions may lead to persistent pain, discomfort, and psychological distress, thereby affecting the overall quality of life of patients. The aim is to determine the incidence of post-mastectomy pain and phantom breast syndrome in patients undergoing mastectomy for carcinoma breast. Materials and Methods: This observational study was conducted in the Department of General Surgery for a period of 2 years. A total of 100 patients who underwent mastectomy were included in the study. Patients above 35 years of age who underwent simple mastectomy or modified radical mastectomy were enrolled after obtaining informed consent. Patients were evaluated during follow-up visits approximately 8 to 12 weeks after surgery. A detailed clinical examination and a standardized questionnaire were used to assess the presence of postoperative pain and phantom breast symptoms. Data were entered in Microsoft Excel and analyzed using Epi Info and SPSS software. Results: Among the 100 patients studied, 67% underwent modified radical mastectomy and 33% underwent simple mastectomy. Post-mastectomy chest wall pain was observed in 21% of patients, while 11% experienced arm pain. Phantom breast pain was reported in 4% of patients, and phantom breast sensations were observed in 6% of patients. Most patients (66%) did not report any significant psychological impact following surgery. Conclusion: Post-mastectomy pain and phantom breast syndrome remain important postoperative complications, particularly following modified radical mastectomy. Early recognition, appropriate patient counseling, and improved surgical techniques may help reduce these complications and improve postoperative quality of life. 
Keywords: Breast cancer, Mastectomy, Post-mastectomy pain syndrome, Phantom breast pain, Phantom breast sensation.
Page No: 107-111 | Full Text
Original Research Article
PREVALENCE AND DETERMINANTS OF VACCINE HESITANCY AMONG ADULTS IN URBAN POPULATION
http://dx.doi.org/10.70034/ijmedph.2026.2.21
Samrat Prodhan, Suranjana Sarkar, Suman Kumar Roy, Arnab Ghosal, Ayan Ghosh
View Abstract
Background: Vaccine hesitancy has emerged as a major public health concern, even in urban populations with better access to healthcare services. It is influenced by multiple sociodemographic and behavioral factors, affecting vaccine uptake and the success of immunization programs. This study aimed to assess the prevalence and determinants of vaccine hesitancy among adults in an urban population. Materials and Methods: A community-based cross-sectional study was conducted among 351 adults in the urban field practice area of a tertiary care center in Kalyani, West Bengal. Participants were selected using multistage sampling. Data were collected using a pretested semi-structured questionnaire. Vaccine hesitancy was assessed based on the WHO SAGE framework. Statistical analysis included descriptive statistics, Chi-square test, and logistic regression. Results: The prevalence of vaccine hesitancy was 28.8%. Higher hesitancy was observed among younger adults, males, individuals with lower education, and those from lower socioeconomic status. Fear of side effects (57.4%), social media misinformation (46.5%), and doubts about vaccine efficacy (40.6%) were the most common reasons. Multivariate analysis revealed that primary education (AOR = 2.48) and lower socioeconomic status (AOR = 2.31) were significant independent predictors of vaccine hesitancy. Conclusion: Vaccine hesitancy remains a significant challenge in urban populations, driven by educational, socioeconomic, and informational factors. Targeted health education, improved communication strategies, and efforts to counter misinformation are essential to enhance vaccine acceptance and strengthen immunization programs. Keywords: Vaccine hesitancy; Socioeconomic factors; Health education; Immunization.
Page No: 112-117 | Full Text
Original Research Article
ASSOCIATION OF ELEVATED FIRST-TRIMESTER SERUM URIC ACID LEVELS WITH THE DEVELOPMENT OF GESTATIONAL DIABETES MELLITUS: A PROSPECTIVE OBSERVATIONAL STUDY
http://dx.doi.org/10.70034/ijmedph.2026.2.22
Mandala Jyothi, Rekha M
View Abstract
Background: Gestational diabetes mellitus (GDM) is one of the most common metabolic complications of pregnancy, leading to significant maternal and fetal morbidity. Early detection and intervention can substantially reduce adverse outcomes. Uric acid, a final product of purine metabolism, has been associated with insulin resistance, endothelial dysfunction, and oxidative stress. The present study was undertaken to evaluate whether elevated first-trimester serum uric-acid levels could serve as an early biochemical marker for predicting GDM. Materials and Methods: This prospective observational study was conducted in the Department of Obstetrics and Gynecology, St. Peter’s Medical College, Hospital and Research Institute, Krishnagiri, over a period from March 2025 to February 2026. A total of 150 antenatal women with gestational age < 14 weeks were enrolled. Serum uric-acid levels and fasting blood sugar were estimated at recruitment. Women were later screened for GDM between 24–28 weeks using a 75-gram oral glucose-tolerance test (OGTT) according to IADPSG/DIPSI criteria. Statistical analysis was performed using SPSS (version __). ROC-curve analysis was used to determine the optimal uric-acid cut-off for predicting GDM. Results: The mean age of participants was 23.8 ± 2.4 years, and the mean BMI was 22.0 ± 2.2 kg/m². The incidence of GDM was 14% (21 of 150). Women with serum uric acid > 3.6 mg/dL in the first trimester had a significantly higher risk of developing GDM (P < 0.01). ROC analysis showed an AUC = 0.90 (SE = 0.05) with 91% sensitivity and 98% specificity, indicating excellent predictive performance. Serum uric acid correlated positively with GDM (r = 0.42, P < 0.01) and was independent of BMI and parity. Conclusion: An elevated first-trimester serum uric-acid level (> 3.6 mg/dL) is a strong and independent predictor of subsequent GDM.
Because uric-acid testing is inexpensive, simple, and widely available, it may be incorporated as an early screening tool in antenatal care to identify high-risk women well before routine glucose testing. Larger multicentric studies are recommended to validate its predictive accuracy and integrate it into early pregnancy screening protocols. Keywords: Gestational diabetes mellitus, uric acid, first trimester, insulin resistance, early prediction.
Page No: 118-123 | Full Text
Original Research Article
STUDY OF HISTOMORPHOLOGICAL PATTERNS OF DCIS ASSOCIATED WITH INVASIVE DUCTAL CARCINOMA
http://dx.doi.org/10.70034/ijmedph.2026.2.23
Muchalapuri Sravani, Nunna Kiranmai, Akuthota Chaitanya
View Abstract
Background: Ductal carcinoma in situ (DCIS) frequently coexists with infiltrating duct carcinoma (IDC) of the breast and represents the non-invasive precursor component within the spectrum of breast carcinogenesis. Evaluating the histomorphological patterns of DCIS occurring alongside IDC provides important insights into tumor biology, aggressiveness, and potential prognostic implications. The aim is to study the histomorphological patterns of DCIS associated with IDC. Materials and Methods: A cross-sectional, descriptive study was conducted in the Department of Pathology, SVS Medical College, Mahbubnagar, over a period of one year, from September 2024 to September 2025. Results: DCIS was identified in the majority of IDC cases, with solid being the most common architectural pattern, followed by comedo and cribriform patterns. Mixed patterns were frequent, accounting for a substantial proportion of cases where more than one subtype of DCIS coexisted. High-grade DCIS showed a strong association with comedo necrosis and was more commonly present in tumors of higher histological grade. Cases with comedo or mixed architectural patterns demonstrated increased frequency of necrosis and a higher likelihood of lymphovascular invasion in the invasive component. No significant difference was observed in age distribution or tumor size in relation to DCIS subtype. Conclusion: DCIS accompanying IDC displays diverse architectural patterns, with solid and comedo types being most prevalent. High-grade nuclear features and comedo necrosis are more frequently associated with aggressive invasive carcinoma characteristics. Understanding the histomorphological spectrum of DCIS in association with IDC is essential, as it may provide prognostic information and help refine therapeutic decision-making. Keywords: Ductal carcinoma in situ (DCIS), Infiltrating duct carcinoma (IDC).
Page No: 124-129 | Full Text
Original Research Article
CLINICAL CORRELATION BETWEEN UNILATERAL PTERYGIUM AND DRY EYE
http://dx.doi.org/10.70034/ijmedph.2026.2.24
Balusupati Aalekhya, Farha Jabeen, Udayasree G, Mohammed Ather
View Abstract
Background: The aim is to study dry eye in cases of unilateral pterygium. The objectives are to study the incidence of dry eye among patients with unilateral pterygium and to find the clinical correlation between dry eye and unilateral pterygium. Results: This study demonstrates a significant association between pterygium and dry eye disease, as evidenced by measurable impairments in tear film stability and production. Using objective clinical parameters—Tear Break-Up Time (TBUT), Tear Meniscus Height (TMH), and Schirmer’s Tests I and II—we found that patients with pterygium consistently exhibited reduced values across all indicators when compared to controls. Among these, TBUT showed the strongest correlation with both the presence and severity of pterygium, underscoring the progressive nature of tear film instability as pterygium advances in grade. While TMH and Schirmer’s test values were also significantly lower in pterygium patients, their association with pterygium severity was less pronounced, likely due to sample size limitations and variability in tear secretion patterns. Nevertheless, these findings align with existing literature suggesting that pterygium contributes to both aqueous tear deficiency and tear film instability—key factors in the pathogenesis of dry eye disease. Conclusion: The present study highlights the clinical importance of routine screening for dry eye symptoms in patients with pterygium, particularly in moderate to severe cases. Early identification and management of tear film dysfunction may alleviate symptoms, enhance visual quality, and potentially reduce postoperative recurrence following pterygium excision. Future studies with larger cohorts and standardized diagnostic criteria are recommended to further validate and refine the association between pterygium and dry eye disease. Keywords: Pterygium, Dry eye, TBUT, TMH, Schirmer’s test.
Page No: 130-136 | Full Text
Original Research Article
A CASE CONTROL STUDY TO FIND OUT CHILD FEEDING PRACTICES RESPONSIBLE FOR SEVERE ACUTE MALNUTRITION AMONG UNDER FIVE YEARS OF CHILDREN AT VARANASI
http://dx.doi.org/10.70034/ijmedph.2026.2.25
Rituparna Ray
View Abstract
Background: Severe Acute Malnutrition (SAM) remains a major public health problem among under-five children in developing countries. Inappropriate child feeding practices play a crucial role in the development of malnutrition during early childhood. The aim is to identify and evaluate child feeding practices associated with Severe Acute Malnutrition among children under five years of age. Materials and Methods: This was a hospital-based case-control study conducted at the Nutritional Rehabilitation Centre of Pandit Deendayal Upadhyay Hospital, Pandeypur, Varanasi. The study was carried out over a period of eight months, from May 2025 to December 2025. Results: Analysis of breastfeeding practices among the study participants revealed significant differences between cases and controls. Among the 50 cases with Severe Acute Malnutrition (SAM), only 20 (40%) children received early initiation of breastfeeding within one hour of birth, compared to 40 (80%) of the 50 controls (p<0.001). Exclusive breastfeeding for the first six months was observed in 18 (36%) cases and 35 (70%) controls (p<0.001), showing a significant protective effect of exclusive breastfeeding. Partial breastfeeding was more common among cases, reported in 25 (50%) children versus 12 (24%) controls (p=0.001). Children who were not breastfed at all included 7 (14%) cases and 3 (6%) controls, but this difference was not statistically significant (p=0.18). Conclusion: Suboptimal child feeding practices are major determinants of Severe Acute Malnutrition. Strengthening awareness and education regarding appropriate infant and young child feeding practices is essential to reduce the burden of malnutrition among under-five children. Keywords: Severe Acute Malnutrition, Under-five Children, Child Feeding Practices, Case-Control Study, Breastfeeding, Complementary Feeding, Dietary Diversity, Public Health.
Page No: 137-141 | Full Text
Original Research Article
CLINICAL SPECTRUM AND OUTCOMES OF ULTRASOUND-GUIDED INTERVENTIONS IN A TERTIARY CARE CENTER
http://dx.doi.org/10.70034/ijmedph.2026.2.26
Devidas Dahiphale, Shivaji Pole, Asmita Suryawanshi, Yash Tank
View Abstract
Background: Ultrasound (USG)-guided interventions have revolutionized minimally invasive diagnostic and therapeutic procedures by enabling real-time visualization, improving accuracy, and reducing complication rates. USG guidance improves overall clinical results, diagnostic yield, and procedural safety as compared to blind procedures. The purpose of this study was to assess the complication profile, technical success, clinical spectrum, and diagnostic adequacy of USG-guided interventions carried out in a tertiary care hospital. Materials and Methods: This retrospective observational study was conducted in the Department of Radiology at MGM Medical College, Chhatrapati Sambhajinagar, Maharashtra, over a one-month period (01 July 2025 to 31 July 2025). A total of 122 patients who underwent various USG-guided diagnostic and therapeutic procedures were included. Hospital records were used to gather information on demographics, procedure type, technical success, diagnostic yield, and complications. Results were evaluated in terms of complication rates, diagnostic sample adequacy, and procedure success. Results: Among 122 patients, 55.7% were males and 44.3% were females, with an age range of 1–82 years. Pleural tapping was the most commonly performed procedure (32.8%), followed by thyroid FNAC (13.9%), breast biopsy (9.8%), and lymph node FNAC (9.0%). Technical success was achieved in 99.2% of cases, with a diagnostic yield of 95.1%. No complications were observed in 93.5% of patients. Minor complications occurred in 6.5% of cases, predominantly localized pain (62.5%) and minor bleeding (37.5%). No major complications or mortality were reported. Conclusion: USG-guided procedures demonstrate high technical success, superior diagnostic reliability, and low complication rates. These results support their use as safe, efficient, and standard practices in tertiary healthcare settings.
Keywords: Ultrasound-guided interventions, Minimally invasive procedures, Diagnostic yield, Technical success rate, Complication profile.
Page No: 142-148 | Full Text
Original Research Article
SURGICAL OUTCOMES OF FLEXOR HALLUCIS LONGUS AND PERONEUS BREVIS TENDON TRANSFERS IN NEGLECTED ACHILLES TENDON RUPTURES
http://dx.doi.org/10.70034/ijmedph.2026.2.27
Veluri Atchuta Ramaiah, Harish Kodi, Kada Venika, Padigapati Venkata Abhilash Reddy
View Abstract
Background: Neglected Achilles tendon ruptures, defined as untreated for over 3 weeks, lead to tendon retraction, muscle atrophy, and functional deficits such as weak plantar flexion and gait instability. Surgical reconstruction via tendon transfer—flexor hallucis longus (FHL) or peroneus brevis (PB)—is standard when primary repair fails, but comparative outcomes remain underexplored. Materials and Methods: This prospective comparative study at a tertiary orthopedic department included 25 patients (13 FHL transfers in Group A, 12 PB transfers in Group B; aged 30–60 years) with chronic closed ruptures. FHL transfer followed the modified Wapner technique; PB transfer used standard posterior/lateral harvest. Postoperatively, patients received equinus casting for 6 weeks, followed by progressive weight-bearing and physiotherapy. Outcomes at 6 months were assessed using the Quigley scale, Leppilahti score, and AOFAS hindfoot-ankle score; complications were recorded. Results: The groups were balanced in age, sex, and injury-to-surgery interval (3–5 weeks: 12 patients; >5 weeks: 13 patients). Group A showed superior outcomes: excellent Quigley grades (6 vs 2), Leppilahti scores of 91–100 (8 vs 2), and AOFAS scores of 91–100 (6 vs 2); all Group A patients achieved good or excellent results versus 11 of 12 in Group B. Complications also favored Group A: superficial infections (2 vs 4) and no neurological complications (versus 3 in Group B); no reruptures occurred in either group. Conclusion: FHL transfer provides superior functional scores and fewer complications than PB transfer for neglected Achilles ruptures, positioning it as the preferred option. Keywords: Achilles tendon rupture, flexor hallucis longus transfer, peroneus brevis transfer, tendon reconstruction, Quigley scale, AOFAS score.
Page No: 149-155 | Full Text
Original Research Article
A STUDY ON THE PREVALENCE OF LOW T3 SYNDROME IN HEART FAILURE WITH REDUCED EJECTION FRACTION AT NORTHERN RAILWAY CENTRAL HOSPITAL, NEW DELHI
http://dx.doi.org/10.70034/ijmedph.2026.2.28
Gopika krishnan B G, Celestina Dungdung, Sachin Madaan, Sanjay Joshi, Atul Gupta, Madhu Kaushal
View Abstract
Background: Heart failure (HF) is associated with high morbidity and mortality and is often accompanied by endocrine abnormalities, including thyroid dysfunction. Low T3 syndrome (LT3S), defined by reduced free triiodothyronine (FT3) with normal thyroid stimulating hormone (TSH) and free thyroxine (FT4), has been linked to adverse outcomes in HF. This study aimed to determine the prevalence of low T3 syndrome in patients with heart failure with reduced ejection fraction (HFrEF) and its association with clinical outcomes. Materials and Methods: This prospective observational study was conducted at Northern Railway Central Hospital, New Delhi over 18 months from July 2023 to December 2024. A total of 200 patients with HFrEF (LVEF <40%) were enrolled. Baseline clinical evaluation, thyroid function tests, and echocardiography were performed. Patients were categorized into low T3 syndrome and normal T3 groups and followed for six months to assess mortality, hospitalization, and major adverse cardiovascular events (MACE). Results: The mean age was 64.95 ± 12.99 years and 57.5% were males. Low T3 syndrome was observed in 40% of patients. Mortality at six months was significantly higher in the low T3 group compared to the normal T3 group (52.5% vs 16.67%, p<0.05). Hospitalization rates were also higher in the low T3 group (71.25% vs 49.17%, p<0.05). However, no significant association was found between low T3 syndrome and individual cardiovascular events. Conclusion: Low T3 syndrome is common in patients with HFrEF and is associated with increased mortality and hospitalization. Assessment of thyroid function may help identify high-risk patients. Keywords: Heart failure, Low T3 syndrome, HFrEF, Thyroid dysfunction, Mortality.
Page No: 156-160 | Full Text
Original Research Article
A COMPARATIVE STUDY OF INCIDENCE OF INCISIONAL HERNIA FOLLOWING EMERGENCY AND ELECTIVE SURGERIES
http://dx.doi.org/10.70034/ijmedph.2026.2.29
Ashfa Neelofer Mohammed, Ashwini Todewale
View Abstract
Background: Aim: To compare the occurrence of incisional hernia in emergency and elective surgeries at Shadan Institute of Medical Sciences. Objectives: 1) To compare the incidence of incisional hernia in emergency and elective surgeries. 2) To analyse the factors contributing to the development of incisional hernia in patients who have undergone elective and emergency surgeries at Shadan Institute of Medical Sciences. Materials and Methods: This was a case-control study conducted at the Department of General Surgery, Shadan Institute of Medical Sciences, Peerancheru, Hyderabad. Patients admitted to the surgical wards of Shadan Institute of Medical Sciences with incisional hernia whose primary surgery had been performed within the preceding 5 years, and patients who had undergone emergency or elective surgery up to 5 years earlier, were included. Results: In the present study, incisional hernia was more common after emergency surgeries as compared to elective surgeries. The significance of age as a risk factor for incisional hernia could not be determined in the present study; further study with a greater sample size of affected as well as control groups is required to establish a causal relation. Incisional hernia was more common in males as compared to females, but male sex was not a risk factor for the development of incisional hernia. Post-operative wound infection, diabetes mellitus, smoking, BMI >25, and COPD can be considered risk factors for the development of incisional hernia. Conclusion: The present study concluded that the incidence of incisional hernia can be decreased by following proper suturing techniques, using appropriate suture material, observing sterile methods preoperatively, and by achieving optimum glycaemic control. Keywords: Incisional Hernia, BMI, COPD, Glycemic control, Smoking.
Page No: 161-168 | Full Text
Original Research Article
DIAGNOSIS OF PLACENTA ACCRETA SPECTRUM IN POST-CAESAREAN PREGNANCY: A COMPARATIVE STUDY OF ULTRASOUND AND MAGNETIC RESONANCE IMAGING
http://dx.doi.org/10.70034/ijmedph.2026.2.30
Thendral G, Laishram Trinity Meetei, Laishram Deepak Kumar, Keisham Miranda Devi, Sheral Raina Tauro, Tamphasana Maimom
View Abstract
Background: Placenta accreta spectrum (PAS) is a life-threatening obstetric condition associated with abnormal placental invasion and rising caesarean section rates. Early diagnosis is essential for optimal management. This study evaluated the diagnostic performance of ultrasonography (USG) and magnetic resonance imaging (MRI) in post-caesarean pregnancies. Materials and Methods: A cross-sectional study was conducted over two years at RIMS, Imphal, including 84 pregnant women with prior caesarean section. All underwent USG, and MRI was performed in suspected cases. Histopathology was the gold standard. Data were analyzed using SPSS with chi-square test, t-test, and ROC analysis. Results: Most participants were aged 18–27 years (54%) with a mean age of 33.23 ± 5.91 years. Higher gravidity and multiple LSCS were significantly associated with PAS (p < 0.05). MRI showed higher sensitivity (65.3%) and specificity (77.3%) compared to USG (55.2% and 63.6%). ROC analysis indicated moderate diagnostic accuracy (MRI AUC = 0.614; USG AUC = 0.682). Conclusion: USG and MRI are effective for PAS diagnosis, with MRI showing better diagnostic performance. Multiparity and prior LSCS significantly increase PAS risk, highlighting the need for early detection and multidisciplinary management. Keywords: Placenta accreta spectrum, ultrasonography, MRI, caesarean section, multiparity, diagnostic accuracy.
Page No: 169-174 | Full Text
Original Research Article
COMPARATIVE STUDY OF EFFICACY, MOTOR, SENSORY BLOCKADE AND DURATION OF BLOCKADE OF INTRATHECAL HYPERBARIC LEVOBUPIVACAINE WITH FENTANYL AND HYPERBARIC ROPIVACAINE WITH FENTANYL IN LOWER ABDOMINAL AND LOWER EXTREMITY SURGERY
http://dx.doi.org/10.70034/ijmedph.2026.2.31
K. Deepa, Mangesh Suresh Gore, Anil Parde, Kadali Lakshmi Sudha
View Abstract
Background: Spinal anesthesia is widely used for lower abdominal and lower extremity surgeries due to its rapid onset, reliable sensory and motor blockade, and favorable safety profile. Newer local anesthetics such as levobupivacaine and ropivacaine are increasingly preferred because of their lower cardiotoxicity and neurotoxicity compared to bupivacaine. The addition of opioids like fentanyl as an adjuvant enhances the quality and duration of spinal anesthesia. Aim: To determine and compare the characteristics of subarachnoid block induced by hyperbaric levobupivacaine 0.5% with fentanyl and hyperbaric ropivacaine 0.75% with fentanyl in patients undergoing lower abdominal and lower extremity surgeries. Materials and Methods: This comparative study included 80 patients who were randomly divided into two groups of 40 each. Group L received intrathecal hyperbaric levobupivacaine 0.5% with fentanyl, while Group R received hyperbaric ropivacaine 0.75% with fentanyl. The onset and level of sensory blockade, duration of sensory and motor blockade, motor block intensity using the Modified Bromage Scale, duration of analgesia, hemodynamic parameters, and adverse effects were recorded and analyzed. Results: The mean age of patients was comparable between the groups (p = 0.63). Group L demonstrated significantly longer duration of sensory blockade (187 ± 16 min vs 134 ± 13.1 min, p < 0.001), longer motor block duration (145 ± 15.6 min vs 109 ± 9.44 min, p < 0.001), and prolonged analgesia compared to Group R. The onset of sensory block was faster in Group R (8.55 ± 2.80 min) than in Group L (10.7 ± 3.57 min, p = 0.004). Hemodynamic parameters were largely comparable between groups. Conclusion: Both drug combinations provided effective spinal anesthesia; however, levobupivacaine with fentanyl produced longer sensory and motor blockade, while ropivacaine with fentanyl allowed earlier recovery, making it suitable for shorter procedures. 
Keywords: Spinal anesthesia, Levobupivacaine, Ropivacaine, Fentanyl, Sensory blockade, Motor blockade.
Page No: 175-180 | Full Text
Original Research Article
COMPARATIVE STUDY OF EFFICACY BETWEEN FOLEY’S INDUCTION WITH MISOPROSTOL AND MIFEPRISTONE WITH MISOPROSTOL IN SECOND TRIMESTER ABORTION
http://dx.doi.org/10.70034/ijmedph.2026.2.32
Rekha M, Mandala Jyothi
View Abstract
Background: Termination of pregnancy in the second trimester is associated with increased maternal risks. The MTP Act, 1971 (amended in 2020) defines the ideal place and person who can conduct MTP. The safest, low-cost method of inducing MTP is preferred. This study was conducted to evaluate the efficacy and outcome of two medical methods of MTP, i.e., Foley’s bulb with misoprostol versus mifepristone with misoprostol. Materials and Methods: This observational study was conducted in the Department of Obstetrics and Gynaecology, St. Peter’s Medical College, Hospital and Research Institute, Krishnagiri, over a 12-month period. A total of 150 patients in their second trimester eligible for MTP were included and divided into 2 groups of 75 each. Group A was induced with Foley’s bulb and misoprostol, and Group B with mifepristone and misoprostol. Results: Most of the patients belonged to the 20-30 years age group, and 86% of the study population was parous. The majority of patients were married and belonged to the lower socioeconomic class. Most patients were at 17-20 weeks of gestation, and request for MTP was the most common indication. The number of misoprostol doses required decreased significantly with increasing parity. The induction-to-abortion interval was shorter in Group B, and patients in Group B had a 100% success rate with minimal complications. Conclusion: Although Foley’s bulb with misoprostol is of low cost, it is inferior in efficacy to mifepristone with misoprostol. Keywords: Abortion, second trimester, Foley’s, misoprostol, mifepristone.
Page No: 181-185 | Full Text
Original Research Article
EVALUATING COMPLIANCE WITH THE HOUR-1 SEPSIS BUNDLE AND THE IMPACT OF A RAPID RESPONSE INTERVENTION IN SUSPECTED SEPSIS PATIENTS - A PROSPECTIVE INTERVENTIONAL STUDY
http://dx.doi.org/10.70034/ijmedph.2026.2.33
Chodarapu Vedasri, Vijaya Kumar Punnapu, Yasoda Devi Kakaraparthi
View Abstract
Background: Sepsis remains a leading cause of preventable mortality worldwide, particularly in resource-limited settings where delayed recognition and inconsistent adherence to evidence-based guidelines are common. The Surviving Sepsis Campaign recommends implementation of the Hour-1 Sepsis Bundle to improve outcomes. However, compliance remains suboptimal in many tertiary-care hospitals. Objective: To evaluate compliance with the Hour-1 Sepsis Bundle in suspected sepsis patients and to assess the impact of a simple Rapid Response Card intervention on improving bundle adherence. Materials and Methods: This prospective interventional study was conducted over two months (October–December 2025) in the Department of General Medicine and ICUs of a tertiary-care teaching hospital in Visakhapatnam. Ninety-two adult patients with newly diagnosed sepsis (46 pre-intervention and 46 post-intervention) were included. Baseline compliance with individual Hour-1 bundle components—blood cultures, intravenous (IV) antibiotics, IV fluids, and vasopressors (when indicated)—was recorded. Serum lactate measurement was excluded due to non-availability. During the intervention phase, a bedside Sepsis Rapid Response Card was introduced to prompt timely action. Compliance rates and patient outcomes were compared using the Chi-square test, with p < 0.05 considered statistically significant. Results: IV antibiotic compliance improved significantly from 34.8% pre-intervention to 84.8% post-intervention (χ²=21.88, p<0.001). Complete Hour-1 bundle compliance increased from 13% to 32.6% (χ²=3.95, p=0.047). Blood culture collection improved from 15.2% to 30.4%, and IV fluid administration increased from 63% to 73.9%, though these changes were not statistically significant. Mortality decreased from 21.7% in the pre-intervention group to 13% post-intervention, representing a clinically meaningful reduction. Discharge rates increased from 71.7% to 82.6%. 
Conclusion: Implementation of a low-cost bedside Rapid Response Card significantly improved compliance with key components of the Hour-1 Sepsis Bundle, particularly timely IV antibiotic administration and overall bundle completion. Although full compliance remains suboptimal, structured, simple quality-improvement interventions can enhance early sepsis management and potentially improve clinical outcomes in resource-constrained settings. Keywords: Sepsis, Hour-1 bundle, Rapid response intervention, Antibiotic compliance, Quality improvement, Tertiary care hospital.
Page No: 186-194 | Full Text
Original Research Article
CARDIOVASCULAR RISK FACTOR BURDEN IN PATIENTS UNDERGOING CORONARY ARTERY BYPASS GRAFTING: A CLINICAL ANALYSIS
http://dx.doi.org/10.70034/ijmedph.2026.2.34
Neha Sharma, Lokendra Sharma, Dhruva Sharma, Neelima Sharma, Preksha Sharma, Uma Advani
View Abstract
Background: Coronary artery disease (CAD) remains one of the leading causes of morbidity and mortality worldwide. In patients with severe or advanced disease, coronary artery bypass grafting (CABG) is an established surgical procedure that helps restore adequate blood flow to the heart. Understanding the demographic characteristics and cardiovascular risk factor profile of patients undergoing CABG is important for improving clinical management and long-term outcomes. The objective is to evaluate the demographic profile, clinical parameters, and major cardiovascular risk factors among patients undergoing coronary artery bypass grafting. Materials and Methods: This observational descriptive study included 200 patients who underwent CABG at a tertiary care center. Data on demographic variables, clinical parameters, and comorbid conditions such as hypertension, diabetes mellitus, dyslipidemia, chronic kidney disease, and heart failure were collected. Continuous variables were expressed as mean ± standard deviation, while categorical variables were presented as frequency and percentage. Chi-square test, independent sample t-test and Pearson correlation were applied where appropriate, with a p-value <0.05 considered statistically significant. Results: The mean age of patients was 61.8 ± 10.7 years, and males constituted the majority of the study population (78%). Hypertension was the most common risk factor (55%), followed by diabetes mellitus (33.5%), hypercholesterolemia (24%), hypertriglyceridemia (21%), heart failure (18%), and chronic kidney disease (13.5%). A significant positive correlation was observed between age and systolic blood pressure (r = 0.28, p = 0.002) as well as between age and blood glucose levels (r = 0.21, p = 0.01). Conclusion: Patients undergoing CABG were predominantly older males with a high burden of cardiovascular risk factors, particularly hypertension and diabetes. 
Early identification and effective management of these modifiable risk factors may help reduce disease progression and improve clinical outcomes. Keywords: Coronary artery disease, Coronary artery bypass grafting, Cardiovascular risk factors, Hypertension, Diabetes mellitus, Dyslipidemia.
Page No: 195-199 | Full Text
Original Research Article
COMPARATIVE STUDY OF CLINICAL AND BIOCHEMICAL PROFILES IN ALCOHOLIC VS. NON-ALCOHOLIC STEATOHEPATITIS- RELATED CIRRHOSIS
http://dx.doi.org/10.70034/ijmedph.2026.2.35
Aman Arora, Shweta Arya
View Abstract
Background: Differentiating Alcoholic Steatohepatitis (ASH) from Non-Alcoholic Steatohepatitis (NASH) is challenging. This study evaluates the diagnostic utility of MRI, focusing on perivascular branching heterogeneity, in distinguishing ASH from NASH. Materials and Methods: This retrospective cohort study included 90 MRI exams from 60 NASH and 30 ASH patients, with both MRI and liver biopsy performed within 13 months. MRI findings were independently scored by two radiologists, and intraclass correlation coefficients (ICC) and receiver operating characteristic (ROC) analysis were used to assess diagnostic accuracy. Results: The mean age of NASH and ASH patients was 60.5 ± 9.38 and 54.1 ± 11.48 years, respectively (p=0.012). Perivascular branching heterogeneity was observed in 63% of ASH patients and 30% of NASH patients (Reader 1). The ICC was 0.69 (0.46–0.82), and ROC analysis showed an area under the curve (AUC) of 0.69 for Reader 1 and 0.72 for Reader 2. The positive predictive value (PPV) for perivascular branching was 65% (Reader 1) and 67% (Reader 2). Conclusion: MRI, particularly perivascular branching heterogeneity, can aid in differentiating ASH from NASH. However, the moderate diagnostic accuracy suggests it should be used alongside clinical and biochemical data for more reliable diagnosis. Further studies with larger cohorts and advanced imaging are needed. Keywords: Steatohepatitis, Alcoholic Liver Disease, Non-Alcoholic Steatohepatitis (NASH), Liver Cirrhosis, Biochemical Profile, Clinical Profile.
Page No: 200-204 | Full Text
Original Research Article
RADIOLOGICAL AND LABORATORY CORRELATION OF PLASMA LEAKAGE IN DENGUE INFECTION USING CHEST AND ABDOMINAL ULTRASONOGRAPHY
http://dx.doi.org/10.70034/ijmedph.2026.2.36
Shweta Arya, Aman Arora
View Abstract
Background: Dengue Fever is a rapidly spreading vector-borne illness in tropical and subtropical regions. Plasma leakage due to increased vascular permeability is a key feature of severe dengue and may lead to complications such as pleural effusion, ascites, and shock. Early identification of plasma leakage is essential for timely clinical management. Ultrasonography of the chest and abdomen has emerged as a useful non-invasive modality for detecting early radiological evidence of plasma leakage. The objective is to evaluate the radiological and laboratory correlation of plasma leakage in dengue infection using chest and abdominal ultrasonography. Materials and Methods: This hospital-based observational study included 100 patients with serologically confirmed dengue infection. Demographic details, clinical findings, and laboratory parameters such as platelet count and hematocrit levels were recorded. All patients underwent thoracoabdominal ultrasonography to detect radiological signs of plasma leakage, including pleural effusion, ascites, gallbladder wall thickening, and pericardial effusion. The ultrasonographic findings were correlated with laboratory parameters. Statistical analysis was performed using appropriate tests, and a p-value <0.05 was considered statistically significant. Results: The mean age of the patients was 34.6 ± 11.2 years, with a male predominance (58% males). Ultrasonographic findings suggestive of plasma leakage were observed in 54% of patients. The most common findings were gallbladder wall thickening (38%), followed by ascites (32%) and pleural effusion (28%). Patients with ultrasonographic evidence of plasma leakage had significantly lower platelet counts (58,300 ± 18,200 cells/mm³) and higher hematocrit levels (44.2 ± 4.9%) compared to those without plasma leakage (p < 0.001). Conclusion: Chest and abdominal ultrasonography is a valuable, non-invasive tool for the early detection of plasma leakage in dengue infection. 
When combined with laboratory parameters, it can help identify patients at risk of severe disease and support timely clinical management. Keywords: Dengue fever, Plasma leakage, Ultrasonography, Pleural effusion, Hematocrit.
Page No: 205-210 | Full Text
Original Research Article
PREVALENCE, RISK FACTORS AND CLINICAL PROFILE OF CLOSTRIDIOIDES DIFFICILE INFECTION IN ANTIBIOTIC-ASSOCIATED DIARRHOEA: A PROSPECTIVE OBSERVATIONAL STUDY FROM NORTHERN KERALA
http://dx.doi.org/10.70034/ijmedph.2026.2.37
Bonitta Rachel Abraham, Sajaad Manzoor, Tinu Abraham Kuruvilla
View Abstract
Background: Clostridioides difficile infection (CDI) is a major cause of antibiotic-associated diarrhoea (AAD), particularly in elderly hospitalized patients. Indian data on CDI from semi-urban regions remain limited. Objectives: To determine the prevalence of CDI among patients with antibiotic-associated diarrhoea and to identify associated risk factors and clinical characteristics. Materials and Methods: This prospective observational study was conducted in a tertiary care hospital in Northern Kerala during a period of 15 months. A total of 125 hospitalized adult patients with diarrhoea following antibiotic exposure were included. Stool samples were tested for glutamate dehydrogenase antigen and Clostridioides difficile toxins A and B using a rapid immunoassay. Statistical analysis was performed using SPSS version 20.0. Results: The prevalence of CDI was 17.6%. CDI was more common among males (68.2%), with a mean age of 73.9 ± 10.02 years. Significant associations were observed with advanced age, exposure to broad-spectrum antibiotics—particularly glycopeptides, fluoroquinolones, carbapenems and lincosamides—acid suppressive therapy, steroid use and Ryle’s tube feeding. Both metronidazole and vancomycin were associated with favourable clinical outcomes. Conclusion: CDI is a significant cause of antibiotic-associated diarrhoea among elderly hospitalized patients. Early identification and judicious antibiotic use are essential to reduce morbidity. Keywords: Clostridioides difficile, antibiotic-associated diarrhoea, CDI, risk factors, India.
Page No: 211-214 | Full Text
Original Research Article
A STUDY ON PERCEIVED BARRIERS TO HEALTHY EATING IN HIGH SCHOOL ADOLESCENTS IN URBAN SRIKAKULAM
http://dx.doi.org/10.70034/ijmedph.2026.2.38
K. Sree Pardhvi Sarma, Y. Neelima
View Abstract
Background: Healthy eating in childhood is essential for growth and for the prevention of long-term non-communicable diseases. Children face unique barriers shaped by taste preferences, peers, family habits, and food marketing. Understanding these barriers helps in designing child-centred nutrition interventions. Objectives: • To identify the barriers to healthy eating among urban high school adolescents in Srikakulam. • To provide evidence-based recommendations regarding healthy eating. Materials and Methods: A descriptive cross-sectional study was conducted over 3 months among 550 high school students in urban Srikakulam. Using multistage sampling, five schools were selected randomly, and students were chosen using simple random sampling. Data were collected through a self-administered questionnaire on sociodemographic factors and perceived barriers, and analysed in Excel and Epi Info using descriptive statistics. Results: The majority of participants (73.6%) were in the 12-14 years age group, and 60.4% were male. Guidance from parents on healthy food choices (mean score = 4.28), guidance from teachers on nutrition and healthy food choices (mean score = 3.93), consumption of at least 3 meals a day (mean score = 3.88), complexity in understanding information on food labels (mean score = 3.73), and easy availability of unhealthy foods (samosas, chips) (mean score = 3.62) were the main perceived barriers. Conclusion: Innovative behaviour change strategies are needed to inculcate healthy eating habits, with the engagement of parents and teachers. The school curriculum needs to be modified to impart nutritional knowledge as well as motivation for behaviour change. Keywords: Adolescents, Healthy eating, Perceived barriers.
Page No: 215-220 | Full Text
Case Series
EVALUATING THE SAFETY OF INTRACRANIAL CAROTID ARTERY STENTING: OUR EXPERIENCE
http://dx.doi.org/10.70034/ijmedph.2026.2.39
Nidhi Roy, Ajay Sharma, Nakul Rathore, Akshay Pol, Abu Shahma, Pandurang Barve, D.K. Tyagi
View Abstract
Intracranial atherosclerotic disease is a major cause of ischemic stroke in Asian populations, and while aggressive medical therapy remains the standard of care, a subset of patients with severe intracranial stenosis continues to experience recurrent ischemic events. This retrospective case series evaluates the safety and short-term outcomes of intracranial carotid artery stenting in five medically refractory patients with cavernous ICA stenosis, including four patients with right-sided disease presenting with recurrent TIAs and one patient with left-sided stenosis presenting with persistent headache and dizziness. All patients underwent detailed clinical and angiographic evaluation, followed by endovascular treatment using balloon angioplasty and/or intracranial stenting under standardized institutional protocols. The degree of stenosis ranged from 68% to 90%, and technical success was achieved in all cases with satisfactory luminal expansion and restoration of antegrade flow. No peri-procedural complications, including stroke, haemorrhage, or death, were observed. All patients demonstrated complete resolution of presenting ischemic or haemodynamic symptoms and remained free of recurrent transient ischemic attacks or stroke during 3–6 months of follow-up. These findings suggest that intracranial carotid artery stenting can be performed safely in carefully selected, medically refractory patients when undertaken in experienced centres, although larger prospective studies with longer follow-up are required to define its long-term role. Keywords: Intracranial atherosclerotic disease, Cavernous internal carotid artery, Intracranial carotid artery stenting, Endovascular treatment, Ischemic stroke.
Page No: 221-228 | Full Text
Original Research Article
IMPACT OF DIETARY AND LIFESTYLE MODIFICATION ON NUTRITIONAL STATUS IN CHILDREN WITH HEMOPHILIA- A LONGITUDINAL STUDY
http://dx.doi.org/10.70034/ijmedph.2026.2.40
C Karthikeyan, V Vahini, Dhiya P. Reji
View Abstract
Background: Hemophilia is a hereditary bleeding disorder characterized by recurrent bleeding episodes, particularly into joints and muscles, leading to chronic arthropathy and functional limitation. With improved survival due to factor replacement therapy and prophylaxis, long-term issues such as nutritional status, physical activity, and quality of life have gained increasing importance. Children with hemophilia are vulnerable to both undernutrition and overweight/obesity due to reduced mobility, recurrent pain, fear of physical activity, and socioeconomic factors. Addressing dietary habits and lifestyle practices may play a crucial role in improving overall health outcomes in this population. Aim: To evaluate the impact of dietary and lifestyle modification on nutritional status among children with hemophilia in a tertiary care center. Materials and Methods: This longitudinal study included 60 children aged 0–18 years with hemophilia A or B registered at the Integrated Centre for Hemophilia and Hemoglobinopathy, Coimbatore Medical College Hospital. The study was conducted over 6 months (October 2024 to March 2025). Baseline anthropometric measurements were recorded, and nutritional status was assessed using WHO and IAP growth charts as appropriate for age. Individualized dietary counseling and lifestyle modifications, including physiotherapy-based strengthening and range-of-motion exercises, were provided. Follow-up assessments were conducted at the end of 6 months to evaluate changes in nutritional status. Clinical profile, treatment pattern, joint health score (HJHS), and social impact were also analyzed. Results: At baseline, 13.3% of children were overweight, 13.3% were underweight, and 11.7% were obese. After intervention, overweight reduced by 62.5% (p=0.001) and underweight reduced by 75% (p=0.036), while obesity remained unchanged (p=0.699). Nutritional abnormalities were more common in older children. 
The majority had severe hemophilia (75%) and were on prophylaxis (63.3%). Joint health improved in 60.5% of children receiving prophylaxis. Functional limitations and school absenteeism were common. Conclusion: Dietary and lifestyle modification significantly improved undernutrition and overweight among children with hemophilia, though obesity persisted. Integrating structured nutritional and physiotherapy interventions into comprehensive hemophilia care may enhance functional outcomes and overall well-being. Keywords: Hemophilia, Nutritional status, Dietary modification, Lifestyle intervention, Children.
Page No: 229-235 | Full Text
Original Research Article
SERUM VITAMIN D3 AND IGF-1 AS COMPOSITE MARKERS OF DISEASE SEVERITY AND BONE METABOLIC DYSFUNCTION IN LIVER CIRRHOSIS
http://dx.doi.org/10.70034/ijmedph.2026.2.41
Kamlesh Kumar Sharma, Vandana Sharma, Sandeep Nijhawan, Sandhya Mishra
View Abstract
Background: Liver cirrhosis is associated with multiple endocrine abnormalities. Vitamin D3 and Insulin-like Growth Factor-1 (IGF-1) are both metabolised in the liver and are known to play key roles in bone metabolism. Their individual deficiency has been documented in cirrhosis, but their combined utility as composite markers of disease severity and bone metabolic dysfunction remains underexplored. Materials and Methods: A case-control study was conducted at SMS Medical College and Hospitals, Jaipur. Fifty diagnosed cirrhotic patients were enrolled as cases and 50 age- and sex-matched healthy subjects as controls. Serum Vitamin D3 and IGF-1 levels were measured and correlated with Child-Turcotte-Pugh (CTP) class and Model for End-Stage Liver Disease (MELD) score. Results: Serum Vitamin D3 was deficient in 84% of cirrhotic patients with mean levels of 18.30±14.38 ng/ml vs 40.85±23.63 ng/ml in controls (p<0.001). Serum IGF-1 was low in 58% of patients (68.16±60.14 vs 182.62±129.41 ng/ml; p<0.001). Both parameters declined progressively across CTP classes A, B and C: Vitamin D3 (89.2 → 23.65 → 7.56 ng/ml) and IGF-1 (128.67 → 81.68 → 43.36 ng/ml). Vitamin D3 showed the strongest negative correlation with disease severity (r = -0.803 with CTP; r = -0.816 with MELD). The composite deficiency of both markers was significantly associated with advanced disease (CTP Class C and MELD ≥15). Conclusion: Serum Vitamin D3 and IGF-1 are significantly depleted in liver cirrhosis, and their levels correlate strongly with disease severity. The combined assessment of these two markers offers a clinically practical, non-invasive index of hepatic-bone metabolic dysfunction that tracks CTP and MELD progression. Routine supplementation and monitoring of Vitamin D3 and IGF-1 should be considered an integral part of cirrhosis management. Keywords: Liver Cirrhosis, Vitamin D3, IGF-1, CTP Score, MELD Score, Bone Metabolic Dysfunction, Disease Severity.
Page No: 236-240 | Full Text
Original Research Article
BURDEN OF NON-COMMUNICABLE DISEASE RISK FACTORS AND THEIR ASSOCIATION WITH LIFESTYLE PRACTICES AMONG ADULT POPULATION IN THE FIELD PRACTICE AREAS OF SINDHUDURG DISTRICT
http://dx.doi.org/10.70034/ijmedph.2026.2.42
Sachin Sharma, Santosh Gadadnavar
View Abstract
Background: Non-communicable diseases (NCDs) are the leading cause of morbidity and mortality in India, largely driven by modifiable behavioral and lifestyle risk factors. Early identification of these risk factors at the community level is essential for effective prevention and control. The present study was conducted to assess the burden of major NCD risk factors and their association with lifestyle practices among adults in the field practice areas of Sindhudurg district, Maharashtra. Objectives: 1. To estimate the prevalence of selected NCD risk factors among the adult population. 2. To assess lifestyle practices related to diet, physical activity, tobacco, and alcohol use. 3. To determine the association between lifestyle practices and NCD risk factors. Materials and Methods: A community-based cross-sectional study was conducted from January to June 2026 in the field practice areas of SSPM Medical College, including RHTC Pandur (population 3,250), RHTC Kasal (population 4,399), and UHTC Kudal (population 23,600), covering service areas of RH Malvan, SDH Kankavli, and RH Kudal. Adults aged ≥18 years residing in the study area for at least one year were included using multistage sampling. Data were collected using a pretested structured questionnaire based on the WHO STEPwise approach. Information on socio-demographic profile and lifestyle practices was obtained. Anthropometric measurements and blood pressure were recorded using standard protocols. Data were analyzed using appropriate descriptive statistics and Chi-square test, with p<0.05 considered statistically significant. Results: A total of 600 adults participated in the study. The prevalence of tobacco use, alcohol consumption, physical inactivity, and inadequate fruit and vegetable intake was 32.5%, 21.3%, 46.8%, and 72.4%, respectively. Overweight/obesity (BMI ≥25 kg/m²) was observed in 28.7% of participants, while 24.5% had hypertension and 11.2% reported known diabetes. 
Significant associations were found between physical inactivity and overweight/obesity (p<0.001), tobacco use and hypertension (p=0.02), and unhealthy diet with both obesity and diabetes (p<0.05). Clustering of two or more risk factors was observed in 41.6% of the study population. Conclusion: A high burden of behavioral and metabolic NCD risk factors was observed among adults in the field practice areas of Sindhudurg district, with significant associations between unhealthy lifestyle practices and major risk conditions. Community-based lifestyle modification strategies, regular screening, and strengthened primary health care interventions are urgently needed to reduce the future burden of NCDs. Keywords: Non-communicable diseases, Risk factors, Lifestyle practices, WHO STEPS, Community-based study, Hypertension, Obesity.
Page No: 241-246 | Full Text
Original Research Article
PREDICTIVE FACTORS AND DEVELOPMENT OF A NOMOGRAM FOR POSTOPERATIVE PANCREATIC FISTULA FOLLOWING PANCREATIC RESECTION: A PROSPECTIVE STUDY
http://dx.doi.org/10.70034/ijmedph.2026.2.43
Haritha Gorantla, Sanjay Kumar Gurjar, Vimal Kumar Mittal, Sitaram Yadav, Niranjan Kumar Gandhi, Sindhura Bukka
View Abstract
Background: Postoperative pancreatic fistula (POPF) remains a major contributor to morbidity following pancreatic resections. Identifying reliable predictors is essential for risk stratification and improved perioperative management. This study aimed to evaluate risk factors associated with POPF and develop a predictive nomogram. Materials and Methods: A prospective study was conducted on 140 patients undergoing pancreatic resection between June 2023 and November 2024 at a tertiary care center. Preoperative, intraoperative, and postoperative variables were analyzed. Logistic regression analysis was used to identify predictors of POPF. A nomogram was constructed and validated using receiver operating characteristic (ROC) analysis. Results: The overall incidence of POPF was 15.7%. No significant association was observed with most clinicopathological variables. Significant predictors included higher body mass index (BMI), increased postoperative white blood cell count, soft pancreatic texture, smaller pancreatic duct size, greater postoperative albumin decrease, and elevated drain amylase on postoperative day 1. The nomogram demonstrated excellent discrimination with an area under the curve (AUC) of 0.97. Conclusions: The proposed nomogram provides an effective tool for individualized prediction of POPF risk and may assist in optimizing perioperative strategies in pancreatic surgery. Keywords: Pancreatic fistula; Pancreatic resection; Risk factors; Nomogram; Albumin; Drain amylase.
Page No: 247-253 | Full Text
Original Research Article
PREVALENCE, PATTERN AND DETERMINANTS OF DRUG-RESISTANT TUBERCULOSIS AMONG ADULT TB PATIENTS REGISTERED AT SELECTED TUBERCULOSIS UNITS IN JHANSI DISTRICT
http://dx.doi.org/10.70034/ijmedph.2026.2.44
Vimal Arya, Mahendra Chouksey, Bharti
View Abstract
Background: Drug-resistant tuberculosis (DR-TB) poses a major challenge to tuberculosis control in India. Evidence regarding the burden and determinants of DR-TB at the sub-district level is limited. Objectives: To estimate the prevalence and identify the socio-clinical determinants of drug-resistant tuberculosis among adult patients with TB registered at selected tuberculosis units in Jhansi district. Materials and Methods: A retrospective record-based analytical cross-sectional study was conducted among 300 adult patients with TB registered at the Mauranipur and Bangra Tuberculosis Units between May 2024 and May 2025. Data regarding sociodemographic characteristics, clinical history, and drug susceptibility testing were extracted from Nikshay portal records. Associations between determinants and DR-TB were assessed using the chi-square test and odds ratio, with p < 0.05 considered statistically significant. Results: The overall prevalence of drug-resistant tuberculosis was 26% (78/300). DR-TB was significantly more common among males (30.5%) than females (15.6%) (p < 0.001). Patients aged >50 years had a higher prevalence (32.9%) than younger patients (23.9%) (p = 0.028). Previous TB treatment was strongly associated with DR-TB (55% vs. 6.7%; OR = 17.5, p < 0.001). HIV infection and a history of smoking were also significantly associated with drug resistance. No significant differences were observed between the Mauranipur and Bangra tuberculosis units regarding the major determinants. Conclusion: A considerable burden of drug-resistant tuberculosis was observed among patients with TB in the study area. Previous TB treatment, HIV infection, smoking, and older age were important determinants. Strengthening early detection, treatment adherence, and targeted interventions for high-risk groups is essential to control DR-TB in the region. Keywords: Drug-resistant TB, MDR-TB, Jhansi, Risk factors, NTEP.
Page No: 254-258 | Full Text
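The headline odds ratio for previous TB treatment can be sanity-checked from the published proportions alone. A minimal Python sketch (the small discrepancy from the reported OR = 17.5 reflects rounding of the published percentages, not the underlying raw counts):

```python
def odds_ratio(p_exposed: float, p_unexposed: float) -> float:
    """Odds ratio from two outcome proportions: (p1/(1-p1)) / (p2/(1-p2))."""
    return (p_exposed / (1 - p_exposed)) / (p_unexposed / (1 - p_unexposed))

# Published proportions of DR-TB: 55% among previously treated vs 6.7% among new patients.
or_prev_treatment = odds_ratio(0.55, 0.067)
print(f"OR = {or_prev_treatment:.1f}")  # ~17.0 from rounded percentages (reported: 17.5)
```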
Original Research Article
BONE MARROW INVOLVEMENT IN NON-HODGKIN LYMPHOMA AT A TERTIARY CARE HOSPITAL
http://dx.doi.org/10.70034/ijmedph.2026.2.45
Akula Anitha, C. Aruna, Puppala Srilatha, Akarsh, Annapoorna Sirisha, Triveni Bhopal
View Abstract
Background: Non-Hodgkin lymphomas (NHLs) are a heterogeneous group of disorders arising from the lymphoid system. NHL occurs at virtually all ages but is uncommon in children. Low-grade NHLs (follicular lymphoma and chronic lymphocytic leukemia) are more common in the USA, whereas diffuse large B-cell lymphoma (DLBCL) is the most common subtype in India. Bone marrow aspiration and biopsy are an important component of the staging workup in NHL. Materials and Methods: This was a prospective study conducted over a period of 24 months on 50 cases of NHL with bone marrow correlation. CD3 and CD20 immunohistochemistry (IHC) was performed to differentiate between B-cell and T-cell lymphoma. Results: Of the 50 cases, 22 underwent both bone marrow aspiration and bone marrow biopsy (BMB). Of these 22 cases, BMB showed positive infiltration in 16 (72.7%), while the remaining 6 (27.3%) were negative for infiltration and showed a normal bone marrow study. Of the 50 cases of NHL, 49 (98%) were B-cell NHL and only 1 was T-cell NHL. Among the 49 B-cell NHL cases, the major subtype was chronic lymphocytic lymphoma. Conclusion: Bone marrow aspiration is a simple and rapid procedure that, given its easier and earlier diagnostic availability, is a useful alternative to biopsy in the study of NHL. B-cell lymphomas were more frequent than T-cell lymphomas, with a predominance of chronic lymphocytic lymphoma and paratrabecular infiltration in the present study. Keywords: Non-Hodgkin lymphoma, Bone marrow aspiration, Bone marrow biopsy.
Page No: 259-265 | Full Text
Original Research Article
A PROSPECTIVE OBSERVATIONAL STUDY IN CHILDREN BETWEEN AGE GROUP OF 1 MONTH TO 12 YEARS IN MEASUREMENT OF SERUM FERRITIN, LIVER ENZYMES - SERUM GLUTAMIC OXALOACETIC TRANSAMINASE (SGOT), SERUM GLUTAMIC PYRUVIC TRANSAMINASE (SGPT) AS A PROGNOSTIC INDICATOR OF SEVERE DENGUE IN CHILDREN ADMITTED IN TERTIARY CARE CENTRE
http://dx.doi.org/10.70034/ijmedph.2026.2.46
Sakthipriya A.R, V.Vahini, Karthikeyan C
View Abstract
Background: Dengue is a major pediatric public health problem in tropical countries, with clinical presentation ranging from dengue with warning signs to severe dengue with shock and organ dysfunction. Early identification of children at risk for severe dengue is essential for timely monitoring and management. Serum ferritin and liver enzymes such as serum glutamic oxaloacetic transaminase (SGOT/AST) and serum glutamic pyruvic transaminase (SGPT/ALT) may serve as useful prognostic biomarkers, but their serial evaluation in hospitalized children remains clinically relevant. The aim is to assess the prognostic value of serum ferritin, SGOT, and SGPT in predicting severe dengue among children aged 1 month to 12 years admitted to a tertiary care centre. Materials and Methods: This was a prospective observational study conducted over 1 year in a tertiary care centre, including 86 children aged 1 month to 12 years with dengue (as per WHO 2023 criteria) and/or NS1Ag/IgM positivity. Children with severe anemia, alternate causes of thrombocytopenia, other confirmed febrile serologies, and chronic transfusion-dependent hemolytic anemia were excluded. Clinical history, examination findings, and investigations including CBC, ferritin, liver function tests, serum albumin, IgM dengue, USG abdomen, and chest X-ray were recorded. Serum ferritin, SGOT, SGPT, and albumin were measured on Day 1 and Day 3 of admission. Data were analyzed to compare dengue with warning signs (DWS) and severe dengue, and ROC analysis was performed to determine diagnostic cut-offs. Results: Of 86 children, 66 (76.7%) had DWS and 20 (23.3%) had severe dengue; no child had dengue without warning signs. DSS constituted 18 (20.93%) and DHF 2 (2.33%) cases. Altered sensorium (p=0.001), tender hepatomegaly (p=0.027), and clinical fluid accumulation (p=0.022) were significantly associated with severe dengue. 
Mean ferritin levels were significantly higher in severe dengue on Day 1 (719.46 ± 245.13 vs 378.66 ± 163.53 ng/mL; p=0.001) and Day 3 (567.71 ± 148.95 vs 433.29 ± 135.72 ng/mL; p=0.006). SGOT and SGPT were also significantly elevated in severe dengue on both days (p<0.05). On ROC analysis, Day 1 ferritin showed the best performance (cut-off 536 ng/mL, sensitivity 95.0%, specificity 78.5%, Youden index 0.735), outperforming SGOT, SGPT, and albumin. Conclusion: Serum ferritin is a strong prognostic biomarker for severe dengue in children, with SGOT and SGPT providing additional supportive value. Serial measurement of ferritin and liver enzymes, along with clinical assessment, can help in early risk stratification and close monitoring of pediatric dengue cases in tertiary care settings. Keywords: Dengue; Severe dengue; Serum ferritin; SGOT; SGPT.
Page No: 266-273 | Full Text
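The Youden index quoted for the Day-1 ferritin cut-off follows directly from the reported sensitivity and specificity. A one-line Python check:

```python
def youden_index(sensitivity: float, specificity: float) -> float:
    """Youden's J statistic: sensitivity + specificity - 1 (0 = useless, 1 = perfect)."""
    return sensitivity + specificity - 1

# Day-1 ferritin cut-off of 536 ng/mL: sensitivity 95.0%, specificity 78.5%.
j = youden_index(0.95, 0.785)
print(f"Youden index = {j:.3f}")  # 0.735, matching the reported value
```

The ROC cut-off that maximizes this J statistic is the conventional choice for a single optimal threshold, which is how a value like 536 ng/mL is typically selected.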
Original Research Article
EFFICACY OF FRACTIONAL CO2 LASER VS COMBINATION OF FRACTIONAL CO2 LASER WITH 50% TCA IN THE TREATMENT OF ATROPHIC ACNE SCARS - A COMPARATIVE STUDY IN A TERTIARY CARE CENTER
http://dx.doi.org/10.70034/ijmedph.2026.2.47
Chaduvula Jahnavi, Kalagarla Sravani, Penugonda N S Sulakshmi Raajitha, Pasagadugula Venkata Krishna Rao, Pemmaraju Ramana Murty
View Abstract
Background: Acne, the most common skin disease, is a disorder of the pilosebaceous units that mainly affects adolescents. Scarring is one of the main sequelae of acne and causes a significant psychosocial burden on affected individuals. Acne scars are classified into atrophic and hypertrophic scars, atrophic scars being the more common of the two. Although many cosmetic procedures are available for treating acne scars individually, a review of the literature revealed very few studies on combination procedures. Hence the present study was undertaken to evaluate the advantage of combination therapy over monotherapy in the treatment of atrophic acne scars. Materials and Methods: Patients attending the out-patient department of DVL from January 2024 to June 2025 with facial atrophic acne scars were considered for this study. Written consent was obtained. Details of symptoms, duration, site involved, type of scars, skin type, and any evolving dermatosis were recorded. Photographs were taken before and after completion of treatment, and qualitative grading was done before and after treatment. Patients meeting the inclusion criteria were randomly allocated to two treatment groups. Group A received fractional CO2 laser only; Group B received fractional CO2 laser plus 50% TCA CROSS. Results: Forty patients were recruited after applying the inclusion and exclusion criteria. Results were assessed on three parameters: reduction in Goodman and Baron grade, patient assessment, and patient satisfaction. There was a statistically significant difference in Goodman and Baron grade reduction and patient satisfaction between the two groups. Conclusion: Fractional CO2 laser gives the best results for superficial boxcar and rolling scars, while TCA CROSS gives the best results for ice-pick and deep boxcar scars.
Fractional CO2 laser followed by 50% TCA CROSS sequentially as a combination therapy has given excellent results with immense patient satisfaction. Keywords: Atrophic acne scars, Fractional CO₂ laser, Trichloroacetic acid (TCA), Combination therapy, Acne scar management, Skin resurfacing, Dermatological procedures.
Page No: 274-279 | Full Text
Review Article
AN OVERVIEW OF BURDEN OF DEATH DUE TO CONGENITAL ANOMALIES IN INDIA
http://dx.doi.org/10.70034/ijmedph.2026.2.48
Prerna Chandra, Neha Yadav, Vivek Mishra, Pradip Kharya, Oshin Verma
View Abstract
Background: Congenital anomalies are a significant but often overlooked cause of morbidity and mortality in India. These birth defects, ranging from structural to functional abnormalities, contribute notably to neonatal and under-five deaths, especially in low- and middle-income countries. The objective of this study is to assess the burden, mortality patterns, and regional variations of congenital anomalies in India and to highlight actionable strategies for prevention, early detection, and management. Materials and Methods: This review synthesizes national and regional data from published studies, World Health Organization reports, and government health surveillance documents. Mortality statistics by anomaly type and age group were analyzed using data from 2015 to 2021. Regional prevalence patterns were compiled from hospital-based studies conducted across different Indian states. Results: Among congenital anomalies, the highest mortality was observed in children under five years of age. Congenital heart anomalies were the leading cause, followed by neural tube defects and chromosomal disorders such as Down syndrome. Survival rates vary widely depending on the condition, with Down syndrome showing high survival and congenital heart diseases showing lower survival within the first year of life. Conclusion: Congenital anomalies remain a major public health concern in India. Strengthening antenatal screening, neonatal diagnosis, birth defect surveillance, and health worker training is critical to reducing the associated mortality and long-term disability. Public awareness, integrated care systems, and national-level registries are essential to address the current gaps in prevention and management. Keywords: Congenital anomalies India, Congenital heart defects, Neural tube defects, Birth defect surveillance, Antenatal screening.
Page No: 280-286 | Full Text
Original Research Article
ANGIOGRAPHIC SPECTRUM OF INTRACRANIAL VASCULAR PATHOLOGIES ON CEREBRAL DIGITAL SUBTRACTION ANGIOGRAPHY IN A TERTIARY CARE CENTRE
http://dx.doi.org/10.70034/ijmedph.2026.2.49
Ishaan Arora, Shivaji Pole, D. B. Dahiphale, Asmita Pravin Suryawanshi
View Abstract
Background: Cerebrovascular diseases remain a leading cause of morbidity and mortality worldwide. Accurate imaging of intracranial vasculature is essential for diagnosis and management of these conditions. Digital subtraction angiography (DSA) remains the gold standard for evaluating intracranial vascular pathology due to its superior spatial resolution and dynamic assessment of cerebral circulation. Objective: To evaluate the angiographic spectrum of intracranial vascular pathologies detected on cerebral digital subtraction angiography in patients presenting to a tertiary care centre. Materials and Methods: This retrospective observational study included 34 patients who underwent cerebral digital subtraction angiography between January 2025 and June 2025 at a tertiary care centre. Demographic characteristics, clinical indications, and angiographic findings were obtained from archived angiography reports. Angiographic findings were categorized into intracranial arterial stenosis or occlusion, intracranial aneurysm, vertebrobasilar disease, arteriovenous malformation (AVM), and normal angiographic findings. Results: A total of 34 patients were included, comprising 20 males (58.8%) and 14 females (41.2%), with a mean age of 58 years. The most common indication for cerebral DSA was evaluation of ischemic stroke (55.9%), followed by subarachnoid hemorrhage (20.6%). Intracranial arterial stenosis or occlusion was the most common angiographic finding (52.9%), predominantly involving the internal carotid artery and middle cerebral artery. Intracranial aneurysms were detected in 17.6% of cases, vertebrobasilar disease in 14.7%, and arteriovenous malformation in 2.9%. Normal angiographic findings were observed in 11.8% of cases. Conclusion: Cerebral digital subtraction angiography remains an essential imaging modality for evaluation of intracranial vascular diseases. 
Intracranial arterial stenosis and occlusion were the most frequent angiographic findings in this study, highlighting the importance of DSA in the diagnosis and management of cerebrovascular disorders. Keywords: Digital subtraction angiography, intracranial vascular disease, cerebral angiography, aneurysm, stroke.
Page No: 287-294 | Full Text
Original Research Article
COMPARISON OF PROPOFOL VERSUS SEVOFLURANE ON POST-OPERATIVE COGNITIVE DYSFUNCTION IN GERIATRIC PATIENTS UNDERGOING GENERAL ANAESTHESIA
http://dx.doi.org/10.70034/ijmedph.2026.2.50
Jyotirmayee Patel, Kashish Ahuja, Anjali Singh
View Abstract
Background: Postoperative cognitive dysfunction (POCD) is a common complication in geriatric patients undergoing general anesthesia, leading to impaired memory, attention, and executive function. The choice of anesthetic agent may influence the incidence and severity of POCD. The aim is to compare the effects of propofol and sevoflurane on postoperative cognitive dysfunction in geriatric patients undergoing general anesthesia. Materials and Methods: This prospective comparative study included geriatric patients undergoing elective surgeries under general anesthesia. Patients were divided into two groups based on anesthetic technique: propofol-based total intravenous anesthesia and sevoflurane-based inhalational anesthesia. Cognitive function was assessed preoperatively and on postoperative day 7 using MMSE and MoCA scores. Results: Both groups demonstrated a decline in postoperative cognitive scores; however, the reduction was more pronounced in the sevoflurane group. The incidence of POCD was higher among patients receiving sevoflurane than among those receiving propofol. Propofol was associated with faster recovery, reduced neuroinflammatory impact, and better preservation of cognitive function. Duration of anesthesia and advanced age showed a positive correlation with POCD incidence. Conclusion: Propofol appears to be associated with a lower incidence of postoperative cognitive dysfunction compared to sevoflurane in geriatric patients. Its neuroprotective, anti-inflammatory, and antioxidant properties may contribute to better cognitive outcomes. Careful anesthetic selection may help reduce POCD risk in elderly surgical patients. Keywords: Postoperative cognitive dysfunction, Propofol, Sevoflurane, Geriatric anesthesia, Cognitive function, General anesthesia.
Page No: 295-301 | Full Text
Original Research Article
ROLE OF ULTRASOUND AS THE PRIMARY IMAGING MODALITY AND COMPUTED TOMOGRAPHY IN SUSPECTED ACUTE APPENDICITIS: A RETROSPECTIVE STUDY
http://dx.doi.org/10.70034/ijmedph.2026.2.51
Sahithi Reddy Almela, Parthasarathi A, Gautham Muthu
View Abstract
Background: Acute appendicitis is a common surgical emergency in which delayed diagnosis increases the risk of perforation and sepsis, whereas inaccurate diagnosis may lead to negative appendicectomy. Imaging plays an important role when clinical findings are atypical. This study assessed the diagnostic performance of ultrasonography (USG) as the primary imaging modality and the additional value of computed tomography (CT) in selected patients with suspected acute appendicitis. Materials and Methods: This retrospective record-based study included 60 patients who underwent appendicectomy for suspected acute appendicitis over a 2-year period. Data were obtained from hospital electronic records, radiology archives, operative notes, and histopathology reports. USG was performed in all patients as the initial imaging test and categorized as positive, equivocal/non-diagnostic, or negative. CT was performed only in patients in whom USG was equivocal or in whom there was a persistent clinical suspicion despite negative USG. Final diagnosis was confirmed by operative findings and/or histopathology. Sensitivity, specificity, positive predictive value (PPV), negative predictive value (NPV), and diagnostic accuracy were calculated. Results: Of the 60 patients, 34 (56.7%) were males and most patients were aged 11-30 years (56.7%). Appendicitis was confirmed in 52 patients (86.7%), while 8 (13.3%) represented negative appendicectomy. USG was positive in 38 (63.3%), equivocal/non-diagnostic in 10 (16.7%), and negative in 12 (20.0%) patients. When equivocal studies were considered non-positive, USG showed sensitivity of 69.2%, specificity of 75.0%, PPV of 94.7%, NPV of 27.3%, and diagnostic accuracy of 70.0%. CT was performed in 22 patients and demonstrated sensitivity of 94.4%, specificity of 75.0%, PPV of 94.4%, NPV of 75.0%, and diagnostic accuracy of 90.9%. 
Conclusion: Ultrasonography is a useful first-line investigation in suspected acute appendicitis because of its availability and absence of radiation, but its diagnostic performance is limited by operator dependence, occasional equivocal examinations, and moderate sensitivity. CT provides substantially higher diagnostic accuracy and is an effective problem-solving modality in unresolved cases. A staged USG-first, selective-CT approach appears practical and clinically appropriate. Keywords: Appendicitis, Ultrasonography, Computed Tomography, Sensitivity and Specificity, Appendectomy.
Page No: 302-308 | Full Text
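The reported USG performance figures can be reconstructed from the abstract's counts (52 disease-positive and 8 negative appendicectomies; 38 USG-positive studies, with equivocal examinations counted as non-positive). A minimal Python sketch with the 2×2 cell values back-calculated from those numbers (a cross-check, not the authors' analysis code):

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Standard diagnostic-accuracy measures from a 2x2 contingency table."""
    return {
        "sensitivity": tp / (tp + fn),
        "specificity": tn / (tn + fp),
        "ppv": tp / (tp + fp),
        "npv": tn / (tn + fn),
        "accuracy": (tp + tn) / (tp + fp + fn + tn),
    }

# Cells back-calculated from the abstract: 36 true positives, 2 false positives,
# 16 false negatives (including equivocal studies), 6 true negatives; total 60.
m = diagnostic_metrics(tp=36, fp=2, fn=16, tn=6)
print({k: round(v * 100, 1) for k, v in m.items()})
# {'sensitivity': 69.2, 'specificity': 75.0, 'ppv': 94.7, 'npv': 27.3, 'accuracy': 70.0}
```

All five values match the published figures, confirming the internal consistency of the abstract's counts.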
Original Research Article
AUDIT OF REPEAT IMAGING REQUESTS AND FACTORS CONTRIBUTING TO REPEAT CT/MRI EXAMINATIONS: AN OBSERVATIONAL STUDY
http://dx.doi.org/10.70034/ijmedph.2026.2.52
Ravi Elango, Kalpana.R, Ajit Kumar, Annitha Elavarasi J
View Abstract
Background: Repeat CT and MRI examinations are frequently performed in tertiary care hospitals for clinical reassessment, follow-up, and treatment monitoring. While many repeat studies are appropriate and clinically necessary, a considerable proportion may be avoidable due to poor documentation, non-availability of prior imaging, or communication gaps between departments and institutions. Such repetition can increase patient burden, resource utilization, and workload in radiology services. Auditing repeat imaging requests is therefore important to identify patterns of use and factors contributing to unnecessary duplication. Aim: To audit repeat imaging requests and identify factors contributing to repeat CT and MRI examinations in a tertiary care hospital. Materials and Methods: This observational study was conducted in the Department of Radiology of a tertiary care hospital and included 150 patients who underwent repeat CT or MRI examinations. Data were collected from radiology request forms, imaging registers, hospital information systems, and picture archiving and communication system records using a structured proforma. Variables recorded included age, sex, imaging modality, patient status, source of referral, referring department, anatomical region examined, interval between previous and repeat imaging, availability of prior imaging, adequacy of clinical details, mention of prior imaging, reason for repeat examination, and justification status. Data were analyzed using SPSS version 27.0. Categorical variables were expressed as frequencies and percentages, and associations were tested using chi-square or Fisher’s exact test. A p-value of less than 0.05 was considered statistically significant. Results: Among the 150 patients, the majority belonged to the 31–50 years age group (38.67%), and males predominated (61.33%). CT constituted 64.00% of repeat imaging, while MRI accounted for 36.00%. Most repeat examinations were performed in inpatients (68.00%). 
Brain imaging was the most commonly repeated examination (30.67%), followed by abdomen and pelvis (25.33%). The most frequent interval between scans was 8–30 days (31.33%). Previous imaging was unavailable in 45.33% of cases, and clinical details were inadequate in 38.67%. Disease progression (26.67%) and follow-up/post-treatment evaluation (21.33%) were the commonest reasons for repeat imaging. Overall, 58.67% of repeat examinations were clinically justified, whereas 28.00% were potentially avoidable and 13.33% were likely unnecessary. Avoidable or unnecessary repeat imaging was significantly associated with non-availability of previous imaging (p<0.001), inadequate clinical details (p<0.001), and referral from other departments or institutions (p=0.003). Conclusion: Repeat CT and MRI examinations were common in this tertiary care setting and were influenced by both genuine clinical need and preventable system-related factors. Improved documentation, better access to prior imaging, and stronger interdepartmental communication may reduce unnecessary repeat imaging and enhance the quality of radiology services. Keywords: Repeat imaging; Computed tomography; Magnetic resonance imaging; Radiol
Page No: 309-316 | Full Text
Original Research Article
IMAGING PATTERNS OF MUSCULOSKELETAL INFECTIONS ON MRI/USG AND THEIR CLINICAL CORRELATION: AN OBSERVATIONAL STUDY
http://dx.doi.org/10.70034/ijmedph.2026.2.53
Ajit Kumar, Annitha Elavarasi J, Ravi Elango, Kalpana.R
View Abstract
Background: Musculoskeletal infections encompass a wide spectrum of conditions involving bones, joints, muscles, and soft tissues, which can lead to significant morbidity if not diagnosed early. Clinical presentation is often variable and nonspecific, making imaging modalities such as magnetic resonance imaging (MRI) and ultrasonography (USG) crucial for early detection, characterization, and management. Understanding imaging patterns and their correlation with clinical and laboratory findings is essential for improving diagnostic accuracy and guiding treatment. Aim: To evaluate the imaging patterns of musculoskeletal infections on MRI and USG and to correlate these findings with clinical presentation and laboratory parameters in patients presenting to a tertiary care hospital. Materials and Methods: This observational study included 50 patients with clinically suspected musculoskeletal infection who underwent MRI and/or USG. Clinical data including symptoms, examination findings, and comorbidities were recorded. Laboratory parameters such as total leukocyte count, erythrocyte sedimentation rate (ESR), C-reactive protein (CRP), and culture results were analyzed. Imaging findings on MRI and USG were assessed systematically for features such as soft tissue edema, abscess formation, joint effusion, marrow edema, cortical destruction, and deep fascial extension. Statistical analysis was performed using SPSS version 27.0, and correlation between imaging findings and clinical/laboratory parameters was evaluated. Results: The majority of patients were in the 21–40 years age group (38.00%) with male predominance (64.00%). Pain (92.00%), tenderness (82.00%), and swelling (78.00%) were the most common clinical features. Raised CRP (74.00%) and ESR (70.00%) were frequently observed. Lower limb involvement (48.00%) and osteomyelitis (28.00%) were the most common anatomical and diagnostic patterns, respectively. 
MRI demonstrated significant superiority in detecting marrow edema, cortical destruction, and deep fascial extension (p<0.05), whereas USG was comparable in identifying soft tissue edema, joint effusion, and abscesses. Abscess formation showed significant association with fever, elevated inflammatory markers, positive culture, and need for surgical intervention (p<0.05). Conclusion: MRI and USG are complementary modalities in the evaluation of musculoskeletal infections. MRI is superior for detecting deep and osseous involvement, while USG is valuable for superficial lesions and procedural guidance. Imaging findings, especially abscess formation, correlate well with clinical severity and laboratory parameters, aiding in timely diagnosis and management. Keywords: Musculoskeletal infections, Magnetic resonance imaging, Ultrasonography, Osteomyelitis, Clinical correlation.
Page No: 317-324 | Full Text
Original Research Article
COMPARISON OF ENDOBRONCHIAL CRYOBIOPSY AND CONVENTIONAL FORCEPS BIOPSY IN SUSPECTED LUNG CARCINOMA: A CROSS-SECTIONAL STUDY FROM A TERTIARY CARE CENTRE
http://dx.doi.org/10.70034/ijmedph.2026.2.54
Davinder Kumar Kundra, Vikas Jaiswal, Sachin Baliyan
View Abstract
Background: Accurate histopathological and molecular diagnosis is essential in the management of lung carcinoma, particularly in the era of targeted therapy and immunotherapy. Conventional endobronchial forceps biopsy, though widely used, often yields small and fragmented specimens with crush artifacts. Cryobiopsy has emerged as a promising alternative that may provide larger and better-preserved tissue samples. This study compared the diagnostic yield, specimen adequacy, and safety of cryobiopsy versus forceps biopsy in patients with suspected lung carcinoma presenting with visible endobronchial lesions. Materials and Methods: This hospital-based cross-sectional study included 76 adult patients undergoing flexible bronchoscopy for suspected lung carcinoma with visible endobronchial growth. All patients underwent both forceps biopsy and cryobiopsy during the same procedure. Primary outcome was diagnostic yield. Secondary outcomes included specimen size, adequacy for immunohistochemistry (IHC) and molecular testing, crush artifact, and procedure-related complications. Statistical analysis was performed using paired tests, with p < 0.05 considered significant. Results: Cryobiopsy demonstrated significantly higher diagnostic yield compared to forceps biopsy (88.2% vs 68.4%; p = 0.002). Mean specimen size was larger with cryobiopsy (6.8 ± 1.9 mm vs 3.2 ± 1.1 mm; p < 0.001), with significantly lower crush artifact (7.9% vs 42.1%; p < 0.001). Adequacy for IHC (85.5% vs 57.9%; p < 0.001) and molecular testing (81.6% vs 50.0%; p < 0.001) was superior with cryobiopsy. Bleeding was more frequent with cryobiopsy (39.5% vs 23.7%; p = 0.041), though predominantly mild and manageable. No major procedure-related morbidity or mortality was observed. Conclusion: Endobronchial cryobiopsy provides significantly higher diagnostic yield and superior molecular adequacy compared to conventional forceps biopsy, with an acceptable safety profile. 
Incorporation of cryobiopsy into routine bronchoscopic practice may enhance diagnostic precision and reduce the need for repeat procedures in lung carcinoma evaluation. Keywords: Cryobiopsy; Forceps biopsy; Lung carcinoma; Endobronchial lesion; Diagnostic yield.
Page No: 325-331 | Full Text
Original Research Article
ROTAVIRUS DIARRHOEA IN THE POST-VACCINATION PERIOD: CLINICAL PROFILE AND GENOTYPING STUDY
http://dx.doi.org/10.70034/ijmedph.2026.2.55
Naresh Jindal, Vibha Madan, Pooja, Shisher Agrawal
View Abstract
Background: Rotavirus remains a leading cause of acute gastroenteritis in children under five years of age, despite the introduction of rotavirus vaccination into national immunization programs. In the postvaccination period, understanding changes in clinical presentation, disease severity, and circulating rotavirus genotypes is essential for assessing vaccine impact and guiding public health strategies. Materials and Methods: This hospital-based observational study was conducted among children aged 6 weeks to 59 months presenting with acute diarrhoea at a tertiary care center during the postvaccination period. Demographic details, clinical features, vaccination status, and disease severity were recorded using a structured proforma. Stool samples were tested for Group A rotavirus antigen by enzyme-linked immunosorbent assay (ELISA). Rotavirus-positive samples underwent G and P genotyping using reverse transcription polymerase chain reaction (RT-PCR). Disease severity was assessed using the Vesikari Clinical Severity Score. Statistical analysis was performed to compare clinical and severity parameters between groups. Results: Out of 163 children with acute diarrhoea, 68 (41.7%) were rotavirus positive. Rotavirus positivity was significantly higher among partially vaccinated and unvaccinated children compared to fully vaccinated children (p = 0.031). Rotavirus-positive children had significantly higher rates of vomiting, frequent stools, dehydration, and hospital admission (p < 0.05). Severe disease (Vesikari score ≥11) was observed in 36.8% of rotavirus-positive cases compared to 18.9% of rotavirus-negative cases (p = 0.006). Among rotavirus-positive children, fully vaccinated children had significantly fewer severe cases and shorter hospital stays compared to partially vaccinated or unvaccinated children (p = 0.011 and p = 0.002, respectively). 
Genotype analysis revealed predominance of G9 (25.0%) and G12 (22.1%) genotypes, with P[8] as the most common P type (61.8%). G9P[8] and G12P[6] were the most frequent genotype combinations. Conclusion: Rotavirus continues to contribute substantially to acute diarrhoeal illness in young children in the postvaccination era. While vaccination does not completely prevent infection, it significantly reduces disease severity and hospitalization. The predominance of emerging genotypes such as G9 and G12 underscores the need for continued clinical and molecular surveillance to monitor vaccine impact and guide future immunization strategies. Keywords: Rotavirus; Acute gastroenteritis; Postvaccination period; Vesikari score; Childhood diarrhoea.
Page No: 332-338 | Full Text
Original Research Article
IMPLANT PROSTHODONTIC INTERFACE: A REVIEW OF THE IMPORTANCE OF ABUTMENT DESIGN AND MATERIALS
http://dx.doi.org/10.70034/ijmedph.2026.2.56
Gundabathina Sarala Devi, V Vamsi Krishna Reddy, T. Pavan Kumar, V. Dileep Nag
View Abstract
Dental implants have rapidly become recognized as a dependable and efficient solution for replacing lost teeth, enhancing function, aesthetics, and patient satisfaction. The implant-abutment interface serves as the crucial link between the implant fixture and the prosthetic restoration in implant prosthodontics. The design and material of the abutment are essential in determining the long-term efficacy of implant-supported restorations, as they influence mechanical stability, biological integration, and cosmetic outcomes. The abutment facilitates a sound peri-implant soft tissue seal, minimal micro-movement, and optimal load distribution, whereas suboptimal design and material choices can result in compromised aesthetics, marginal bone resorption, peri-implant irritation, and screw loosening. Consequently, understanding the influence of abutment parameters is crucial to enhancing implant function. This review aims to critically evaluate the role of abutment materials and design in the implant prosthodontic interface. Commonly employed materials, including titanium, zirconia, and hybrid composites, are discussed along with surface characteristics, emergence profile, and connection geometry. The review concludes that the biological and mechanical success of implants is significantly determined by abutment design and materials. The selection of material influences biocompatibility, bacterial adherence, and aesthetic integration, while conical connection designs and platform switching have shown improved stability and reduced crestal bone loss. Zirconia offers superior aesthetics in anterior regions, but titanium's reliability and robustness establish it as the gold standard. New biomaterials and CAD/CAM technology enhance both clinical outcomes and personalization. To achieve optimal patient outcomes and ensure the longevity of implants, meticulous selection of abutment design and material is crucial.
Keywords: Dental implants, Abutment design, Implant–abutment interface, Titanium, Zirconia, Biocompatibility.
Page No: 339-348 | Full Text
Case Series
FUNCTIONAL OUTCOME FOLLOWING INTERNAL FIXATION OF PROXIMAL HUMERUS SURGICAL NECK TWO- AND THREE-PART FRACTURES WITH POLARIS NAIL
http://dx.doi.org/10.70034/ijmedph.2026.2.57
Ramesh M, Manesh Chacko Philip, Ranju Raj
View Abstract
Background: Proximal humerus fractures are common injuries accounting for approximately 4–6% of all adult fractures. Displaced fractures involving the surgical neck may result in significant functional impairment if not managed appropriately. Intramedullary fixation has emerged as a minimally invasive alternative to plate fixation, offering stable fixation while preserving soft tissue and vascular supply. The Polaris intramedullary nail has been specifically designed for proximal humerus fractures and provides multiple proximal locking options to improve stability. The objective is to evaluate the clinical and radiological outcomes of Polaris intramedullary nailing in the management of displaced two-part and three-part surgical neck fractures of the proximal humerus. Materials and Methods: This prospective case series included 25 patients with displaced proximal humerus fractures treated with Polaris intramedullary nailing at a tertiary care orthopaedic centre over a period of two years. Patients with Neer two-part and three-part surgical neck fractures were included. Functional outcomes were assessed using the Constant–Murley shoulder score, and radiological union was evaluated using serial radiographs during follow-up. Statistical analysis was performed using IBM SPSS software version 26.0. Results: The mean patient age was 54.6 ± 12.8 years. The majority of patients were male (64%). Two-part fractures accounted for 60% of cases, while three-part fractures constituted 40%. Radiological union was achieved in 24 patients (96%), with most fractures healing within 10–12 weeks. The mean Constant–Murley score at final follow-up was 78.6 ± 9.4. Good to excellent functional outcomes were observed in 80% of patients. The mean forward flexion was 142.5° and mean abduction was 135.4°. Postoperative complications occurred in three patients (12%), including shoulder stiffness, screw irritation, and delayed union. 
No cases of deep infection, implant failure, or avascular necrosis were observed. Conclusion: Polaris intramedullary nailing provides reliable fixation for displaced two-part and three-part surgical neck fractures of the proximal humerus. The technique allows early mobilization and results in high union rates with satisfactory functional outcomes and a low complication rate. Intramedullary fixation may therefore be considered an effective surgical option for selected proximal humerus fractures. Keywords: Proximal humerus fracture; Intramedullary nailing; Polaris nail; Surgical neck fracture; Constant–Murley score; Shoulder fracture fixation.
Page No: 349-355 | Full Text
Original Research Article
KAP STUDY ON AI IN THE MEDICAL FIELD WITH SPECIAL FOCUS ON AI VS HUMAN DOCTOR
http://dx.doi.org/10.70034/ijmedph.2026.2.58
Shounak Gupta, Harshpreet Singh, Prachi Jain
View Abstract
Background: Artificial intelligence (AI) is swiftly changing healthcare; however, its effect on healthcare is contingent on the knowledge, attitudes, and practices (KAP) of medical practitioners. We conducted a cross-sectional KAP survey of medical students and doctors to understand their awareness of and attitudes towards AI in medicine, and their perceptions of AI compared with human doctors. Previous research has indicated a high level of awareness but little practical adoption among doctors and students. Materials and Methods: A structured questionnaire was used to survey 178 medical students and 30 doctors in a tertiary care institute. It covered demographics, knowledge of AI (e.g. awareness and education), attitudes (e.g. confidence in AI applications, fears of replacement), and practices (e.g. use of AI tools, training). Responses were summarized descriptively, and chi-square tests (significance at p<0.05) were used for categorical comparisons (students vs doctors). Results: Nearly all participants had heard of AI: 99.4% of students and 100% of doctors reported awareness. However, formal AI education was low (18.2% of students vs 0% of doctors had AI in their curriculum). Around 86–87% in both groups knew of AI’s use in medicine. Use of AI tools differed (33.3% of doctors vs 60.0% of students reported occasional or frequent use for study purposes). Positive attitudes were common: 93.8% of students agreed that AI is beneficial in medical education, and 90% of doctors felt AI would benefit patients. In contrast, fewer believed AI could replace human doctors (23.3% of doctors, 39.7% of students). Notably, more doctors than students expressed ethical or privacy concerns (86.7% vs 38.4%, p<0.001).
Chi-square tests showed significant differences in key attitudes: doctors were significantly more likely to view AI as beneficial to patient care (90% vs 50.6%, p<0.001) and to express privacy/ethical concerns (86.7% vs 38.4%, p<0.001), while students were more optimistic about AI replacing traditional learning methods (39.7% vs 23.3%, p<0.001). Conclusion: Both medical students and physicians showed high awareness of AI and generally favorable attitudes toward its role in healthcare, but actual training and use remain limited. Students were enthusiastic about learning AI, whereas doctors emphasized potential ethical issues and the need for guided integration. Our findings underscore the need to incorporate AI education into medical training (as recommended in similar KAP studies) and to foster “human-AI” collaboration rather than viewing AI as a replacement for clinicians. Keywords: Artificial intelligence; medical education; knowledge, attitudes, and practices (KAP); physicians; medical students; AI vs doctor; healthcare technology.
Page No: 356-360 | Full Text
Case Report
ANGIOLEIOMYOMA OF UTERUS – A RARE BENIGN SMOOTH MUSCLE TUMOUR
http://dx.doi.org/10.70034/ijmedph.2026.2.59
Isha Pareek, Sudhamani S, Shreya Shah
View Abstract
Uterine angioleiomyoma (vascular leiomyoma) is an uncommon smooth muscle tumor derived from blood vessels that is reported almost exclusively in the subcutaneous soft tissues of the extremities; its occurrence in the female genital tract is very rare. The radiological and clinical characteristics that distinguish uterine angioleiomyoma from conventional leiomyoma are not well documented in the literature. Herein, we describe a 32-year-old woman with complaints of menorrhagia and pelvic pain. Ultrasonography showed an intramural mass suggesting a leiomyoma, which was excised through an open myomectomy procedure. The pathologic findings were characteristic of angioleiomyoma, and the differential diagnosis with conventional leiomyoma is discussed. Angioleiomyoma should be considered a rare variant of uterine smooth muscle tumors because its vascular nature gives it a particular potential to cause heavy menstrual bleeding and an increased risk of hemorrhage during surgical excision. Keywords: Angioleiomyoma; CD34; Leiomyoma; Menorrhagia; Myomectomy; Uterus; Vascular leiomyoma
Page No: 361-363 | Full Text
Original Research Article
EFFECT OF DURATION OF DISEASE AND GLYCEMIC CONTROL ON ATTENTION, EXECUTIVE FUNCTION AND VISUAL REACTION TIME IN TYPE 2 DIABETES MELLITUS
http://dx.doi.org/10.70034/ijmedph.2026.2.60
Santosh Mayannavar, Hardik Nagar, Nayan Mali
View Abstract
Background: The objective is to assess the effect of disease duration and glycemic control on attention, executive function, and visual reaction time in patients with T2DM. Materials and Methods: This cross-sectional study included 60 participants with T2DM, divided into two groups based on their Glycosylated Hemoglobin (HbA1c) levels: Group A (Good Glycemic Control, HbA1c < 7.0%, n=30) and Group B (Poor Glycemic Control, HbA1c ≥ 7.0%, n=30). Participants were further sub-grouped based on disease duration (< 5 years and ≥ 5 years). All participants were assessed using the Digit Letter Substitution Test (DLST) for attention, the Stroop Color and Word Test (SCWT) for executive function, and a computer-based Visual Reaction Time (VRT) test for processing speed. Statistical analysis was performed using Pearson's correlation coefficient and independent t-tests. Results: The group with poor glycemic control (Group B) demonstrated significantly poorer cognitive performance compared to Group A, evidenced by lower DLST scores (p < 0.01), higher Stroop interference scores (indicating poorer executive function, p < 0.01), and prolonged Visual Reaction Times (p < 0.001). Furthermore, a longer duration of disease (≥ 5 years) was significantly correlated with worse performance on all three cognitive parameters, independent of glycemic control. A significant positive correlation was found between HbA1c levels and VRT (r = 0.52, p < 0.001), while a negative correlation was observed between HbA1c and DLST scores (r = -0.45, p < 0.01). Conclusion: Both poor glycemic control and a longer duration of T2DM are independently and significantly associated with deficits in attention, executive function, and psychomotor speed. These findings underscore the importance of stringent, early glycemic management to potentially mitigate the risk of cognitive decline in the diabetic population. 
Keywords: Type 2 Diabetes Mellitus, Glycemic Control, HbA1c, Attention, Executive Function, Visual Reaction Time, Cognition.
Page No: 364-369 | Full Text
Original Research Article
HEALTH LITERACY AND PATIENT ACTIVATION AMONG ADULTS WITH MULTIMORBIDITY IN URBAN INDUSTRIAL INDIA: A CROSS-SECTIONAL STUDY FROM SURAT WITH DIRECT IMPLICATIONS FOR NURSE-LED CHRONIC DISEASE MANAGEMENT
http://dx.doi.org/10.70034/ijmedph.2026.2.61
Aarjukumari Ghoghari, Bhavesh Chauhan
View Abstract
Background: Non-communicable diseases impose an expanding burden on primary healthcare systems across urban India, particularly in industrial cities where large numbers of working-age migrants live with multiple co-existing chronic conditions. Surat, a major diamond-processing and textile manufacturing hub in South Gujarat, is home to hundreds of thousands of such workers. Despite the scale of this population and the complexity of its health needs, evidence on whether these individuals possess adequate knowledge, skill, or confidence to manage their own conditions — or whether the health information they receive is accessible to them — remains limited. This study was designed to measure health literacy and patient activation in adults with multimorbidity attending outpatient clinics across four Surat hospitals, to assess blood pressure control as a direct clinical outcome, and to identify the characteristics most strongly associated with poor self-management capacity. Materials and Methods: A cross-sectional observational study was conducted between September and December 2024 across four outpatient clinical settings in Surat: Sardar Multispeciality Hospital, Shraddha Multispeciality Hospital, SIMS Hospital, and Dhameliya Kidney Hospital. Adults aged 30 to 65 years with at least two confirmed ICD-10-defined chronic conditions were enrolled using consecutive sampling (n=287). Patient activation was measured with the PAM-13 (Cronbach’s alpha = 0.87); health literacy with the HLS-EU-Q16 (Cronbach’s alpha = 0.83); and medication adherence with the MMAS-8. Blood pressure was measured following triplicate protocol using a calibrated automated sphygmomanometer. Multivariable logistic regression with VIF diagnostics was used to identify independent predictors of poor self-management. A sensitivity analysis excluding obesity and anxiety/depression as sole qualifying conditions was performed. Reporting follows STROBE guidelines. 
Structured patient education and literacy-adapted communication strategies were applied during data collection to ensure patient comprehension of study instruments. Results: Mean PAM-13 score was 52.4 (SD 14.7; 95% CI 50.7–54.1). Over three in five participants (61.7%; 95% CI 56.0–67.1%) registered poor self-management capacity at PAM Level 1 or 2. Low or problematic health literacy affected 68.3% of the sample (95% CI 62.7–73.5%). Among 205 hypertensive participants, 120 (58.5%) had uncontrolled blood pressure. Only 23.6% of those on prescribed medication showed high MMAS-8 adherence. In the adjusted regression model, low health literacy carried the strongest association with poor self-management (aOR 3.84; 95% CI 2.41–6.12), followed by low education (aOR 2.67), unskilled occupation (aOR 2.19), and three or more concurrent conditions (aOR 2.02). All VIFs were below 2.3; Nagelkerke R²=0.38. Findings were robust in sensitivity analysis. No participant had ever accessed a nurse-led chronic disease service; only 18.1% had received any structured self-management education. Conclusion: Low health literacy, poor patient activation, and inadequate medication adherence are common in this population and are associated with objectively poorer blood pressure control, underscoring the need for accessible, literacy-adapted, nurse-led chronic disease management services.
Page No: 370-379 | Full Text
Original Research Article
PROSPECTIVE OBSERVATIONAL STUDY ON SOCIOECONOMIC STATUS AND DIETARY FACTORS RESPONSIBLE FOR NON-INITIATION OR DELAYED INITIATION OF LACTATION
http://dx.doi.org/10.70034/ijmedph.2026.2.62
Agnimita Giri Sarkar, Apurpa Ghosh, Surupa Basu
View Abstract
Background: Lactogenesis II is a hormonally regulated process dependent on coordinated prolactin, insulin, cortisol, and progesterone withdrawal. Increasing evidence suggests that metabolic dysfunction may impair secretory activation despite adequate nutritional intake. Delayed initiation of lactation remains common in India, yet the endocrine–metabolic determinants underlying this phenomenon are insufficiently explored. This study aimed to evaluate the association of socio-economic status (SES) and dietary factors with delayed or non-initiation of lactation, with emphasis on potential endocrine–metabolic mechanisms. Materials and Methods: This analytical cross-sectional study included 400 postpartum mothers (within six months of delivery), comprising 200 cases (delayed/non-initiation of lactation) and 200 controls (normal initiation). SES was classified using the Modified Kuppuswamy Scale 2023. Dietary pattern was categorized as vegetarian or non-vegetarian, and daily caloric intake was estimated using structured dietary recall. Independent variables included maternal age, parity, education, mode of delivery, and antenatal care utilization. Continuous variables were analyzed using independent t-tests and categorical variables using Chi-square tests. A p-value ≤ 0.05 was considered statistically significant. Results: Mothers with delayed lactation were significantly older than controls (28.63 ± 4.67 vs. 24.83 ± 4.08 years; p < 0.0001). Mean caloric intake was significantly higher in the delayed group (2723.77 ± 456.68 kcal) compared to the normal group (1852.99 ± 320.40 kcal; p < 0.0001). A strong association was observed between SES and lactation status (p < 0.001), with delayed lactation more prevalent among upper-middle and lower-middle socio-economic strata. Conclusion: Delayed initiation of lactation is significantly associated with advancing maternal age, higher caloric intake, and middle SES. 
The findings support the hypothesis that endocrine–metabolic dysregulation—particularly impaired insulin sensitivity and altered prolactin responsiveness—may underlie delayed lactogenesis. In socio-economically transitioning populations, metabolic risk factors may exert greater influence on breastfeeding initiation than caloric insufficiency. Integrating metabolic screening with structured lactation support in accordance with World Health Organization recommendations may improve early breastfeeding outcomes and neonatal health indicators. Keywords: Delayed lactation, Lactogenesis II, Socioeconomic status, Caloric intake, Insulin resistance, Endocrine dysfunction, Maternal age, Metabolic imbalance, Breastfeeding initiation.
Page No: 380-384 | Full Text
Original Research Article
VOICES FROM THE FRONTLINE: EXPLORING HEALTH WORKERS’ EXPERIENCES IN NEPAL’S LYMPHATIC FILARIASIS MASS DRUG ADMINISTRATION PROGRAM
http://dx.doi.org/10.70034/ijmedph.2026.2.63
Achut Babu Ojha, Nafisul Hasan, Damaru Prasad Paneru, Sagar Parajuli, Sudip Raj Khatiwada, Md Makabul Rain, Sanjeev Kumar, Bibek Paudel
View Abstract
Background: Lymphatic filariasis (LF) remains a major public health challenge in Nepal. While Mass Drug Administration (MDA) is the cornerstone of elimination, the perspectives and lived experiences of health workers—those who implement the program—are often overlooked. Materials and Methods: This study employed a qualitative-dominant mixed-methods design, with emphasis on focus group discussions (FGDs) among health workers in three endemic districts. Thematic analysis was conducted using Braun and Clarke’s six-step framework. Quantitative survey data (n=257) provided contextual support, but the primary lens was qualitative. Results: Five overarching themes emerged: (1) health workers as strategic implementers, (2) contextually competent resources, (3) community mediators, (4) compliance facilitators, and (5) program optimizers. Health workers described challenges such as fear of side effects, rumors, inadequate supervision, and logistical constraints, but also highlighted their adaptive strategies, community trust building, and commitment to elimination goals. Conclusion: Health workers are not passive conduits of policy but active agents shaping program outcomes. Their narratives reveal both systemic barriers and opportunities for strengthening LF elimination. Policies must prioritize training, supportive supervision, incentives, and community engagement strategies that leverage health workers’ unique position at the frontline. Keywords: Lymphatic Filariasis, Mass Drug Administration, Qualitative Research, Health Workers, Nepal, Community Engagement.
Page No: 385-387 | Full Text
Original Research Article
HPLC-BASED EVALUATION OF HEMOGLOBINOPATHIES IN PEDIATRIC POPULATION: A RETROSPECTIVE CROSS-SECTIONAL STUDY FROM A TERTIARY CARE CENTER IN SOUTH INDIA
http://dx.doi.org/10.70034/ijmedph.2026.2.64
Komati Poornima, Arshiya Anjum, Shirish Kumar Patewar
View Abstract
Background: Hemoglobinopathies are among the most common inherited disorders worldwide and contribute significantly to pediatric morbidity, particularly in developing countries like India. Early detection using reliable diagnostic modalities is essential for timely intervention and genetic counseling. The objective was to evaluate the prevalence and pattern of hemoglobinopathies in pediatric patients using High-Performance Liquid Chromatography (HPLC) and to analyze their hematological characteristics. Materials and Methods: This retrospective cross-sectional study included 150 pediatric patients (0–14 years) over a period of 12 months (January–December 2025). Blood samples collected in EDTA were analyzed using an automated hematology analyzer for red cell indices and the BIORAD D-10 HPLC system for hemoglobin variant detection. Cases were categorized based on hemoglobin fractions and retention times. Results: Of 150 cases, 109 (72.7%) showed a normal hemoglobin pattern, while 36 (24.0%) had hemoglobinopathies and 5 (3.3%) were indeterminate. Among abnormal cases, sickle cell disease (HbSS) accounted for 11 cases (7.3%), beta thalassemia trait for 7 cases (4.7%), sickle cell trait for 6 cases (4%), and compound heterozygous states for 6 cases (4%). A slight male predominance (52.7%) was observed. The most affected age group was 6–10 years (38.9%). Approximately 80% of cases showed microcytic hypochromic indices. Conclusion: Hemoglobinopathies, particularly sickle cell disease and beta thalassemia trait, are prevalent in the pediatric population. HPLC is a reliable and effective tool for screening and diagnosis. Early detection combined with genetic counseling and preventive strategies is essential to reduce disease burden. Keywords: Hemoglobinopathies, Sickle Cell Disease, Beta Thalassemia Trait, High-Performance Liquid Chromatography (HPLC), Pediatric Anemia, Hemoglobin Variants, Red Cell Indices, Screening.
Page No: 388-394 | Full Text
Original Research Article
COMPARISON BETWEEN 5% EMLA CREAM AND LIDOCAINE JELLY ON POST OPERATIVE SORE THROAT FOLLOWING GENERAL ANAESTHESIA WITH ENDOTRACHEAL INTUBATION - A PROSPECTIVE RANDOMIZED DOUBLE BLIND CONTROLLED STUDY
http://dx.doi.org/10.70034/ijmedph.2026.2.65
Yerraboina Madhavi, Sujani Gooty, Sunkesula Sonia Begum, Saya Raghavendra prasad, Chiruvella Sunil
View Abstract
Background: Postoperative sore throat (POST) is the most common complication after endotracheal intubation. In this study we compared the efficacy of 5% EMLA cream and 2% lidocaine jelly on the incidence and severity of postoperative sore throat, hoarseness of voice (HOV), and post-extubation cough (PEC) following general anaesthesia with endotracheal intubation. Materials and Methods: This study included 90 patients aged 18 to 65 years, of either sex, with American Society of Anesthesiologists (ASA) physical status I or II, who were scheduled to receive 5% EMLA cream or 2% lidocaine jelly applied over the endotracheal tube cuff. POST was graded as none (0), mild (1), moderate (2), or severe (3); a score of ≥2 was considered significant POST. The incidence and severity of POST at the sixth postoperative hour was the primary outcome. Secondary outcomes included the incidence of POST, HOV, and PEC at zero, one, and 24 hours. Results: The incidence of postoperative sore throat of any grade at the 1st and 6th postoperative hours was significantly lower in the EMLA group than in the lidocaine group (35.6% vs 60%, p = 0.0202; 15.6% vs 35.6%, p = 0.0296, respectively). The incidence of HOV was significantly lower at the sixth postoperative hour (6.6% vs 22.2%, p = 0.0358). The incidence of PEC was similar in both groups at all time points. Conclusion: 5% EMLA cream on the ETT cuff effectively reduced the incidence and severity of POST and HOV in the first 6 postoperative hours compared with 2% lidocaine jelly. Keywords: Postoperative sore throat (POST), 5% EMLA, 2% Lidocaine jelly, hoarseness of voice, post extubation cough.
Page No: 395-400 | Full Text
Original Research Article
FACTORS INFLUENCING SURVIVAL AND SHORT-TERM OUTCOMES OF VERY LOW BIRTH WEIGHT INFANTS IN A TERTIARY CARE HOSPITAL
http://dx.doi.org/10.70034/ijmedph.2026.2.66
Paramjeet
View Abstract
Background: Very low birth weight (VLBW) infants remain at high risk of mortality and morbidity, particularly in developing countries. Despite advances in neonatal care, outcomes vary widely depending on maternal, perinatal, and neonatal factors. Identifying these factors is essential for improving survival and short-term outcomes. The objective is to evaluate the factors influencing survival and short-term outcomes among very low birth weight infants admitted to a tertiary care hospital. Materials and Methods: This retrospective observational study was conducted in the Neonatal Intensive Care Unit (NICU) of a tertiary care hospital. All neonates with a birth weight between 500 and ≤1500 grams, born between 1 January 2022 and 31 December 2023, and admitted within 24 hours of birth were included. Data on maternal, perinatal, and neonatal variables were collected from medical records. The primary outcome was survival to discharge, while secondary outcomes included major neonatal morbidities. Statistical analysis was performed using appropriate tests, with p < 0.05 considered significant. Results: A total of 1,018 VLBW infants were included, of whom 664 (65.2%) survived and 354 (34.8%) died. Survival was significantly higher among infants with adequate antenatal care, exposure to antenatal corticosteroids, and cesarean delivery. Higher birth weight, gestational age, head circumference, and Apgar scores were significantly associated with improved survival (p < 0.001). Conclusion: Survival of VLBW infants is influenced by a complex interplay of antenatal, perinatal, and neonatal factors. Strengthening antenatal care, ensuring timely interventions, and improving neonatal intensive care practices are crucial to enhancing survival and reducing morbidity in this vulnerable population. Keywords: Very low birth weight, neonatal mortality, NICU, antenatal care, prematurity, neonatal outcomes
Page No: 401-406 | Full Text
Case Report
SEVERE MINERAL BONE DISEASE IN A CHILD WITH A SOLITARY KIDNEY AND NEGLECTED END-STAGE RENAL DISEASE ON MAINTENANCE HAEMODIALYSIS
http://dx.doi.org/10.70034/ijmedph.2026.2.67
Sivaani U, Ramya N, Meyyan A, Punitha J, Karuppiah K, Ganavi Ramagopal
View Abstract
Chronic kidney disease–mineral and bone disorder (CKD-MBD) is a systemic complication of advanced renal dysfunction characterized by abnormalities in calcium, phosphate, and vitamin D metabolism and by secondary hyperparathyroidism, leading to skeletal and cardiovascular complications. We report a neglected pediatric case of end-stage renal disease (ESRD) with a solitary kidney that developed severe mineral bone disease due to prolonged absence of follow-up and lack of supplementation. A male child in late childhood with a history of vesicoureteral reflux–related renal damage and prior nephrectomy presented with progressive severe bone pain, immobility, and reduced urine output. The child had been advised dialysis several years earlier but was lost to follow-up and had not received mineral or vitamin D supplementation. Clinical evaluation suggested advanced CKD with severe skeletal involvement consistent with CKD-MBD. The patient was initiated on maintenance hemodialysis with supportive metabolic management. This case highlights the devastating skeletal consequences of untreated CKD-MBD in children and emphasizes the importance of early detection of reflux nephropathy, long-term nephrology follow-up, and strict metabolic monitoring in pediatric CKD patients. Keywords: CKD–mineral bone disorder, End stage renal disease, Solitary kidney, Hemodialysis, Pediatric renal disease.
Page No: 407-410 | Full Text
Original Research Article
EVALUATING PROTEINURIA'S PROGNOSTIC VALUE IN CHILDHOOD DENGUE INFECTIONS IN A TERTIARY CARE HOSPITAL
http://dx.doi.org/10.70034/ijmedph.2026.2.68
Khizerulla Sharief, Venkatesh K S, Veeralokanadha Reddy M
View Abstract
Background: Dengue is a major public health problem in tropical countries, with children being particularly vulnerable to severe forms of the disease. Early identification of prognostic markers is essential in tertiary care settings to reduce morbidity and mortality. Proteinuria, resulting from endothelial dysfunction and plasma leakage, has emerged as a potential indicator of disease severity in dengue infection. Objectives: To correlate the urine protein creatinine ratio (UPCR) with severity of illness in children diagnosed with dengue fever, and to assess whether proteinuria can serve as a predictor of progression to severe dengue. Materials and Methods: A hospital-based prospective observational study was conducted in the Department of Pediatrics over a period of one year. A total of 140 children aged 1 month to 12 years with laboratory-confirmed dengue infection were included. Daily early morning urine samples were collected from day 3 to day 9 of fever or until discharge. Proteinuria was assessed using urine dipstick and quantified by urine protein creatinine ratio (UPCR). Patients were categorized as probable dengue, dengue with warning signs, and severe dengue as per WHO criteria. Data were analyzed using SPSS version 19.0. Pearson Chi-square test and ANOVA were applied; p < 0.05 was considered statistically significant. Results: Of 140 children, 55% had probable dengue, 24.2% had dengue with warning signs, and 20.7% had severe dengue. Mortality within the severe dengue group was 38%. Both dipstick proteinuria and UPCR values were significantly higher in severe dengue compared to the other groups (p < 0.05). Proteinuria peaked between days 5–6 of illness and declined during recovery. Patients who died demonstrated markedly elevated UPCR and dipstick values compared to survivors. No significant association was observed between platelet count and UPCR.
Conclusion: Proteinuria, particularly quantified by UPCR, shows significant correlation with dengue severity in children. Serial monitoring of UPCR may serve as a useful prognostic marker for early identification of severe disease and guide clinical management in tertiary care settings. Keywords: Dengue; Proteinuria; Urine Protein Creatinine Ratio; Pediatric Dengue; Severe Dengue; Prognostic Marker; Renal Involvement.
Page No: 411-416 | Full Text
Systematic Review
ROLE OF HISTOPATHOLOGY IN FORENSIC DIAGNOSIS OF SUDDEN CARDIAC DEATH: A SYSTEMATIC REVIEW AND META-ANALYSIS
http://dx.doi.org/10.70034/ijmedph.2026.2.69
Pradeep Kumar Nayak, Sudhanshu Sekhar Sethi, Ajith Antony, Priyatosh Dash, Debdas Samantaray, Mahendra Singh, Pratyush Mishra
View Abstract
Background: Sudden cardiac death (SCD) is one of the leading causes of natural death encountered in forensic practice, and it often poses a diagnostic challenge because the heart may show no significant changes at autopsy. Histopathological analysis has long been central to the postmortem investigation of SCD, but the growing number of biochemical, imaging, and molecular methods makes it necessary to reassess the contribution of histology within such a multimodal forensic setting. This systematic review and meta-analysis aimed to assess the role of histopathology in the forensic diagnosis of sudden cardiac death and to compare its diagnostic performance with emerging adjunctive methods. Materials and Methods: A comprehensive literature search of major electronic databases was carried out to identify studies on histopathological examination and adjunctive diagnostics in SCD. Eligible studies included systematic reviews, meta-analyses, and original forensic investigations reporting microscopic cardiac changes or comparative diagnostic results. Data on study type, histopathological findings, and diagnostic performance were extracted and, where possible, pooled quantitatively to estimate overall diagnostic yield and the comparative accuracy of the different modalities. Results: Eight publications comprising nearly 1,485 cases were examined. Histopathology directly determined the cause of death in 72% of cases and contributed supporting evidence in a further 18%. The most frequent microscopic changes were myocardial ischemia/necrosis (41%), interstitial fibrosis (18%), cardiomyopathic remodeling (13%), myocarditis (11%), contraction band necrosis (9%), and vascular abnormalities (8%).
Histopathology showed a high pooled sensitivity (88%) and specificity (92%), exceeding the performance of single adjunctive methods such as cardiac biomarkers, postmortem imaging, and molecular analyses. A multimodal approach combining histopathology with ancillary techniques yielded the highest overall diagnostic accuracy. Conclusion: Histopathological examination remains essential in the forensic diagnosis of sudden cardiac death as the main confirmatory method disclosing structural and cellular changes of the myocardium. Newer biochemical, imaging, and molecular methods may increase diagnostic certainty, but they are most valuable when used together with traditional microscopy. Standardization of sampling protocols, along with the application of advanced techniques, may improve diagnostic accuracy and reduce the number of unexplained sudden cardiac death cases in forensic medicine. Keywords: Sudden cardiac death, Forensic histopathology, Postmortem diagnosis, Myocardial injury, Molecular autopsy, Multimodal forensic investigation.
Page No: 417-423 | Full Text
Original Research Article
COMPARATIVE STUDY ON WATER, SANITATION AND HYGIENE (WASH) PRACTICES AMONG URBAN AND RURAL HOUSEHOLDS IN THE FIELD PRACTICE AREA OF TERTIARY CARE INSTITUTE
http://dx.doi.org/10.70034/ijmedph.2026.2.70
Kethana Poola, H. Hemachandra, M.K. Sasikala, Sunita Sreegiri
View Abstract
Background: Safe WASH practices are important for human health and well-being, contribute to healthy nutrition, and help people live in healthy environments. According to WHO, the burden of illness associated with WASH accounts for 4.0% of all deaths and 5.7% of the total disease burden worldwide, and for 7.5% of all mortality in India. Appropriate WASH practices can prevent at least 9.1% of the global disease burden and 6.3% of all deaths by reducing water-borne diseases and under-nutrition, and can contribute to a country's economic development. Development of infrastructure and proper regulation will improve WASH practices and reduce the disease burden. This study was conducted to assess and compare water, sanitation and hygiene (WASH) practices among urban and rural households. Materials and Methods: A community-based analytical cross-sectional study was conducted in the urban and rural field practice areas of a Tertiary Care Institute, Tirupati, during July–September 2025. Using multistage random sampling, 170 households were selected. Data on sociodemographic details and WASH practices were collected by interview using the “WHO Core questions on water, sanitation and hygiene for household surveys”. Data were entered in an MS Excel spreadsheet and analysed using R software version 4.5.2. Results: Among the 170 households (85 urban and 85 rural), the main source of drinking water in urban areas was piped water into the dwelling (37.6%), whereas in rural households it was a water kiosk (70.6%). The most common facility in urban areas was a flush toilet connected to a piped sewer system (46.8%), whereas in rural areas the majority (91.1%) had flush toilets draining to a septic tank. Safe drinking water practices were found to be significantly associated with locality (p = 0.002) on a Pearson chi-square test. Similarly, more rural households (90.5%) had improved sanitation facilities than urban households (88.2%).
Conclusion: A significantly higher proportion of rural households had access to improved drinking water sources and sanitation facilities compared to urban households. Overall, safe drinking water practices were strongly associated with locality, emphasizing the need for context-specific interventions to further enhance WASH standards in both settings. Keywords: Drinking water, Sanitation, Hygiene, WASH, Urban and rural households.
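The locality association above was tested with a Pearson chi-square test. As a hedged illustration of that computation (the counts below are hypothetical, not the study's actual data), the statistic for a 2×2 locality-by-practice table can be computed by hand:

```python
def chi_square_2x2(table):
    """Pearson chi-square statistic for a 2x2 contingency table.

    table: [[a, b], [c, d]] -- rows = groups (e.g. urban/rural),
    columns = outcome (safe practice: yes/no).
    """
    row_totals = [sum(row) for row in table]
    col_totals = [sum(col) for col in zip(*table)]
    n = sum(row_totals)
    chi2 = 0.0
    for i in range(2):
        for j in range(2):
            # Expected count under independence of row and column factors
            expected = row_totals[i] * col_totals[j] / n
            chi2 += (table[i][j] - expected) ** 2 / expected
    return chi2

# Hypothetical counts (NOT from the study): urban 60/25, rural 75/10
stat = chi_square_2x2([[60, 25], [75, 10]])
print(round(stat, 2))
```

For a 2×2 table (1 degree of freedom) the 5% critical value is 3.841, so a statistic above that corresponds to p < 0.05; the study's reported p = 0.002 implies a considerably larger statistic.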
Page No: 424-428 | Full Text
Original Research Article
CYTOMORPHOLOGICAL PATTERN ANALYSIS OF VARIOUS LYMPHADENOPATHIES IN A TERTIARY CARE HOSPITAL
http://dx.doi.org/10.70034/ijmedph.2026.2.71
Hassan Sona Rai, Bhargavi Mohan, Manjunath HK, Mythri B M, Vinitra K, Gudrun Koul, Suma HV, Lakshmi Devi M
View Abstract
Background: The human body contains approximately 600 lymph nodes. The tonsils, adenoids, spleen and Peyer’s patches are further components of lymphoid tissue, and their role is to provide immunity against various pathogens. Peripheral lymph nodes lie in the subcutaneous tissue and can be palpated when any pathology causes them to enlarge. Lymph node enlargement is one of the commonest clinical presentations and encompasses a wide spectrum ranging from inflammation to malignant lymphoma and metastatic malignancy. Fine needle aspiration cytology (FNAC) is a rapid, simple, reliable, minimally invasive and cost-effective procedure that can be performed in an outpatient setting; however, histopathological examination remains the gold standard in the evaluation of lymphadenopathy. The aim of this study was to evaluate the cytomorphological patterns of various diseases causing lymphadenopathy. Materials and Methods: A three-year study of 100 cases of lymphadenopathy presenting to the Department of Pathology from 1st January 2023 to 31st December 2025 was undertaken. FNAC was performed using a 23/24-gauge needle and a 5 mL syringe. Smears were fixed in 95% ethyl alcohol and stained with Hematoxylin and Eosin, Papanicolaou, Leishman and Giemsa stains. Special stains such as periodic acid–Schiff (PAS) for mucin and Ziehl–Neelsen (ZN) stain for acid-fast bacilli (AFB) were performed wherever required. Results: Over the three years, a total of 100 cases were received and studied. Patient age ranged from 1 to 78 years; 68 patients were male and 32 were female. Cervical lymphadenopathy was the most common site.
The most common lesion was reactive hyperplasia (25%), followed by granulomatous lymphadenitis (22%), tuberculosis (20%), metastasis (12%), suppurative (10%), necrotic (5%), lymphoma (5%) and parasitic (1%) lesions. Conclusion: Our study highlighted the various cytomorphological patterns of lymphadenopathy and demonstrates that fine needle aspiration is a safe, accurate and valuable tool in their evaluation. FNAC showed a diagnostic accuracy of 95%, making it a useful first-line investigation given that histopathological diagnosis requires an invasive procedure. Keywords: Lymph node, Lymphadenopathy, Fine needle aspiration cytology, Granulomatous, Malignant.
Page No: 429-433 | Full Text
Original Research Article
STUDY OF SERUM TSH AND URIC ACID LEVELS IN PREECLAMPSIA OF PREGNANCY
http://dx.doi.org/10.70034/ijmedph.2026.16.2.72
Vishwajit Shivaji Shinde, Abhay Nagdeote, Sudhir Sase
View Abstract
Background: Preeclampsia, a major hypertensive disorder of pregnancy, is significantly associated with fetal and maternal morbidity. Alterations in thyroid function and purine metabolism have been implicated in its pathophysiology, reflected by changes in serum thyroid-stimulating hormone (TSH) and serum uric acid levels. Objective: To estimate and compare serum TSH and serum uric acid levels in women with pre-eclampsia and age-matched normal pregnant women as controls, and to determine the correlation between these parameters. Materials and Methods: The study was conducted at a tertiary level hospital and included 61 pre-eclamptic women and 61 normotensive pregnant controls. Serum TSH and serum uric acid levels were analyzed from venous blood samples collected from the study participants, and urine samples were assessed for proteinuria. Statistical analysis included the Chi-square test, the unpaired t-test, and Pearson’s correlation. Results: Cases and controls were comparable with respect to age (p > 0.05). Serum uric acid levels were significantly higher in pre-eclamptic women than in controls, both in categorical distribution and in mean values (5.48 ± 1.40 mg/dL vs. 4.01 ± 1.00 mg/dL; p < 0.05). Although the categorical distribution of TSH levels did not differ significantly between groups (p > 0.05), the mean serum TSH level was significantly higher in the pre-eclamptic group (3.11 ± 0.98 µIU/mL vs. 2.50 ± 1.06 µIU/mL; p < 0.05). A statistically significant but mild positive correlation was seen between serum TSH and serum uric acid levels in the pre-eclamptic group (r = 0.30, p < 0.05), whereas no significant correlation was found in the control group (p > 0.05). Conclusion: Women with preeclampsia showed significantly elevated serum uric acid and mean TSH levels, along with a positive correlation between these parameters.
Estimation of serum uric acid and TSH may serve as useful biochemical markers in the evaluation and monitoring of preeclampsia. Keywords: Pregnancy, Preeclampsia, TSH, Uric Acid, Complication in pregnancy, Eclampsia.
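The mild positive correlation reported above (r = 0.30) was obtained with Pearson's correlation coefficient. A minimal sketch of that computation, using made-up illustrative values rather than the study's data:

```python
import math

def pearson_r(x, y):
    """Pearson product-moment correlation coefficient for paired samples."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    # Covariance numerator and the two standard-deviation denominators
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Hypothetical paired TSH (uIU/mL) and uric acid (mg/dL) values
tsh = [2.1, 2.8, 3.0, 3.5, 4.2]
uric = [4.5, 5.0, 5.6, 5.2, 6.1]
print(round(pearson_r(tsh, uric), 2))
```

By convention, |r| around 0.3 is read as a mild correlation, which matches the study's characterization of its r = 0.30 finding.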
Page No: 434-438 | Full Text
Original Research Article
OUTCOMES OF LIMBERG FLAP RECONSTRUCTION IN PILONIDAL SINUS DISEASE: A PROSPECTIVE STUDY
http://dx.doi.org/10.70034/ijmedph.2026.16.2.73
Govardhan M. Gaikwad, Obaid Syed, Ameenuddin Ali Syed, Syed Naureen Nazar
View Abstract
Background: Pilonidal sinus disease (PSD) is a chronic inflammatory condition of the sacrococcygeal region associated with significant morbidity and recurrence. Off-midline closure techniques have demonstrated improved outcomes compared to conventional midline closure. Objective: To evaluate clinical outcomes, postoperative complications, and recurrence following Limberg flap reconstruction in patients with PSD. Methods: This prospective observational study was conducted at a tertiary care center in India from January 2021 to January 2026. A total of 56 adult patients with chronic or recurrent PSD underwent rhomboid excision with Limberg flap reconstruction. Postoperative outcomes, complications, and recurrence were assessed during follow-up. Results: Among 56 patients, 80.4% were male. Surgical site infection occurred in 5.4% of patients, seroma in 3.6%, and wound dehiscence in 1.8%. Transient paraesthesia was observed in 7.1% of patients. No flap necrosis was noted. No recurrence was observed among patients with available follow-up. Conclusion: Limberg flap reconstruction is a safe and effective technique for PSD, associated with low complication rates and favorable short to mid-term outcomes. Keywords: Pilonidal sinus disease; Limberg flap; Rhomboid flap; Off-midline closure; Sacrococcygeal region.
Page No: 439-443 | Full Text
Original Research Article
HAND HYGIENE AWARENESS, PRACTICES, AND PERCEIVED BARRIERS AMONG OUTPATIENTS AND THEIR CAREGIVERS: A CROSS-SECTIONAL STUDY
http://dx.doi.org/10.70034/ijmedph.2026.16.2.74
Patil Sujata Rajendra, Salve Shobha Bansi, Andrea Almeida
View Abstract
Background: Hand hygiene remains a critical yet underutilized public health intervention for preventing infectious diseases and curbing antimicrobial resistance, especially in low-resource settings. Despite increased awareness during the COVID-19 pandemic, sustained behavioural change in community settings remains limited. This study assessed the attitudes and practices related to hand hygiene among patients attending outpatient departments and identified common barriers across sociodemographic groups. Materials and Methods: A cross-sectional study was conducted among 1109 participants attending OPDs at multiple centres within a tertiary care hospital’s field practice area. Data were collected using a structured, pre-validated, interviewer-administered questionnaire through face-to-face interviews. Convenience sampling was used; descriptive and inferential statistics were applied to assess associations. Results: Most participants were from rural areas (72.1%). Awareness was highest for handwashing after toilet use (91.4%), but lower for handwashing after coughing/sneezing (33.6%) and after caregiving (38.5%). Barriers included soap unavailability (49%), self-reported lack of awareness (38.4%), time constraints (32.6%), and lack of clean water (30.7%). Urban residents reported significantly higher consistent soap use than rural counterparts (p = 0.023). Positive attitudes were significantly associated with better hand hygiene practices (p < 0.00001). Conclusion: Despite good awareness of key hand hygiene moments, actual practices remain inconsistent, with significant rural-urban disparities and structural barriers. Interventions must go beyond information dissemination to address access, affordability, and behaviour change, especially in rural settings. Keywords: Hand hygiene, knowledge-attitude-practice (KAP), outpatient care, caregivers, barriers, rural health, infection prevention, India.
Page No: 444-449 | Full Text
Original Research Article
HPV DNA TESTING AS A PRIMARY TOOL FOR CERVICAL SCREENING IN THE COMMUNITY AND ITS CORRELATION WITH CERVICAL CYTOLOGY
http://dx.doi.org/10.70034/ijmedph.2026.2.75
Ganapuram Sindhuja, Archana Dinesh B, Banda Divya
View Abstract
Background: Cervical cancer remains one of the leading causes of cancer-related morbidity and mortality among women worldwide. Early detection of cervical epithelial abnormalities through screening methods such as the Papanicolaou (Pap) smear and Human Papillomavirus (HPV) DNA testing plays a crucial role in identifying precancerous lesions and preventing disease progression. Objectives: To evaluate the prevalence of cervical cytological abnormalities, determine HPV DNA positivity, and analyze the correlation between HPV DNA detection and Pap smear findings among women undergoing cervical cancer screening. Materials and Methods: This cross-sectional study included 286 women who underwent cervical screening. Data were analyzed with respect to demographic characteristics, parity, menstrual history, presenting complaints, per speculum examination findings, Pap smear results, and HPV DNA detection. Cytological findings were classified according to standard Pap smear reporting, and HPV DNA testing was performed to determine viral positivity. Statistical analysis was conducted to assess the association between cytological abnormalities and HPV DNA positivity. Results: Most participants were 41–45 years old, and multiparity (≥3 deliveries) was common. The most frequent symptom was white discharge per vagina, and 22% had an unhealthy cervix on examination. Pap smear showed normal cytology in 52.4%, while 47.6% had abnormalities, mainly inflammatory changes with few cases of ASCUS, LSIL, and HSIL. HPV DNA positivity strongly correlated with abnormal cytology, with all ASCUS, LSIL, and HSIL cases testing HPV positive. The association between abnormal Pap smear findings and HPV DNA positivity was highly statistically significant (χ² = 221.45, p < 0.0001). Conclusion: The study demonstrates a strong correlation between cervical cytological abnormalities and HPV DNA positivity. 
Combined screening with Pap smear and HPV DNA testing enhances the detection of premalignant cervical lesions and can improve early diagnosis and prevention strategies for cervical cancer. Keywords: Cervical cancer screening, Pap smear, Human Papillomavirus Deoxyribonucleic Acid (HPV DNA), Atypical Squamous Cells of Undetermined Significance (ASCUS), Low-Grade Squamous Intraepithelial Lesion (LSIL), High-Grade Squamous Intraepithelial Lesion (HSIL).
Page No: 450-454 | Full Text
Original Research Article
EVALUATION OF MODIFIED TENSION BAND TECHNIQUE IN THE MANAGEMENT OF PATELLAR FRACTURES
http://dx.doi.org/10.70034/ijmedph.2026.2.76
S Khaderulla Basha, Uma Maheshwar Reddy, T Naveen Babu, Madamanchi Harsha
View Abstract
Background: Patellar fractures are common, constituting about 1% of all skeletal injuries, and result from either direct or indirect trauma. The subcutaneous location of the patella makes it vulnerable to direct trauma, as in dashboard injuries or a fall on the flexed knee, whereas violent contraction of the quadriceps produces indirect fractures. These fractures are usually transverse and are associated with tears of the medial or lateral retinacular expansions. In this study, a series of 30 cases of patellar fracture treated with the modified tension band wiring technique was studied. Material and Methods: This prospective study was conducted in the Department of Orthopaedics at Sri Balaji Medical College Hospital & Research Institute, Tirupati, on patients enrolled between July 2024 and June 2025. It comprised 30 cases of patellar fracture treated by modified tension band wiring, selected according to the following criteria. Inclusion Criteria: 1. All closed and type I and type II open displaced transverse patellar fractures. 2. Transverse fractures with displacement of more than 2–3 mm or an articular step of more than 2 mm. 3. Comminuted fractures where reconstruction and fixation by modified tension band wiring was possible. Exclusion Criteria: 1. Type III compound fractures. 2. Grossly comminuted, vertical or marginal fractures. 3. Old fractures (more than 2–3 weeks). 4. Pathological fractures. Conclusion: Our study shows that modified tension band wiring is a definitive procedure in the management of displaced transverse patellar fractures, with few complications, and it facilitates early postoperative mobilization. We observed excellent results in 86.6%, good results in about 10%, and poor results in 3.3% of cases; 4 of the 30 cases had complications.
Early postoperative physiotherapy is essential to the successful management of these fractures, helping to reduce complications such as knee stiffness and to restore good function. Long-term follow-up is necessary to assess late complications such as osteoarthritis and late functional outcome. Keywords: Patellar fractures, modified tension band technique, West’s criteria
Page No: 455-458 | Full Text
Original Research Article
A STUDY TO COMPARE ONDANSETRON AND DEXAMETHASONE FOR PREVENTION OF INTRAOPERATIVE NAUSEA AND VOMITING DURING CESAREAN DELIVERY UNDER SUBARACHNOID BLOCK
http://dx.doi.org/10.70034/ijmedph.2026.2.77
Sherin Soni, Vignesh Rajan, Urmila Keshari, Anuj Keshari, K Gopiraju
View Abstract
Background: Intraoperative nausea and vomiting (IONV) are frequent and distressing complications during cesarean delivery performed under subarachnoid block. These symptoms can compromise maternal comfort, surgical conditions, and overall patient satisfaction. Effective prophylaxis is therefore essential. Ondansetron and dexamethasone are commonly used antiemetics, but comparative evidence during cesarean delivery under spinal anesthesia remains limited. Materials and Methods: This prospective, randomized, double-blind, placebo-controlled study was conducted on 100 term parturients undergoing elective cesarean delivery under spinal anesthesia. Participants were randomly allocated into three groups: placebo (normal saline), dexamethasone 8 mg IV, or ondansetron 4 mg IV administered 20 minutes prior to spinal anesthesia. The primary outcomes were the incidence and severity of nausea, retching, and vomiting. Secondary outcomes included hemodynamic changes, ephedrine requirement, sedation scores, need for rescue antiemetics, and neonatal APGAR scores. Results: Ondansetron significantly reduced the incidence of nausea (18.2%), retching (9.1%), and vomiting (3.0%) compared with dexamethasone and placebo (p < 0.05). Dexamethasone demonstrated moderate efficacy compared with placebo but was less effective than ondansetron. The severity of nausea and requirement for rescue antiemetics were lowest in the ondansetron group. Hemodynamic parameters, sedation scores, and neonatal APGAR scores at 1 and 5 minutes were comparable across all groups. Conclusion: Ondansetron 4 mg IV is superior to dexamethasone 8 mg IV and placebo for preventing intraoperative nausea and vomiting during cesarean delivery under spinal anesthesia. Dexamethasone offers moderate benefit and may serve as an alternative or adjunct. Both agents are safe for maternal and neonatal outcomes. Keywords: Intraoperative Nausea and Vomiting, Ondansetron, Dexamethasone, Cesarean Section, Spinal Anesthesia.
Page No: 459-462 | Full Text
Original Research Article
HEARING LOSS IN CHRONIC KIDNEY DISEASE PATIENTS ON DIALYSIS: A CROSS-SECTIONAL STUDY
http://dx.doi.org/10.70034/ijmedph.2026.2.78
Jubina Puthen Purayil, Kalpana RajivKumar, Swapnil Gosavi, Ketki Pimpalkhute
View Abstract
Background: Chronic kidney disease (CKD) is a progressive systemic disorder that affects multiple organs, including the auditory system. Structural and physiological similarities between the cochlea and the kidney may predispose patients with renal dysfunction to hearing impairment. Objectives: The present study aimed to determine the prevalence and severity of hearing loss among CKD patients undergoing dialysis and to evaluate its association with serum creatinine levels and duration of dialysis. Materials and Methods: A cross-sectional observational study was conducted among 50 patients diagnosed with CKD and receiving maintenance dialysis. Audiological assessment was performed and hearing loss was categorized as none, mild, moderate, or severe. Serum creatinine levels and duration of dialysis were recorded for each patient. Statistical analysis included descriptive statistics, Chi-square test for association, and Pearson correlation to determine the relationship between creatinine levels and severity of hearing loss. A p-value of <0.05 was considered statistically significant. Results: Out of 50 patients, 29 (58%) demonstrated hearing loss. Moderate to severe hearing impairment was observed in 21 patients (42%). A statistically significant association was identified between duration of dialysis and severity of hearing loss (p = 0.03). Serum creatinine levels showed a moderate positive correlation with hearing loss severity (r = 0.62), indicating that higher creatinine levels were associated with greater auditory impairment. Conclusion: Hearing loss is a common complication among CKD patients undergoing dialysis. Both elevated serum creatinine levels and longer duration of dialysis are associated with increased severity of hearing impairment. Incorporating routine audiological screening into CKD management may facilitate early detection and timely intervention. Keywords: Chronic kidney disease, Sensorineural hearing loss, Dialysis, Creatinine, Audiometry.
Page No: 463-466 | Full Text
Original Research Article
EMERGING TRENDS IN URINARY CANDIDIASIS: SPECIES SPECTRUM AND AZOLE RESISTANCE IN A CROSS-SECTIONAL STUDY AT TERTIARY CARE HOSPITAL, TAMILNADU
http://dx.doi.org/10.70034/ijmedph.2026.2.79
Ravikumar S, Ravichandran B, Sathiya M, Aarthy S
View Abstract
Background: Candidiasis is the most common fungal infection, ranging from muco-cutaneous infections to life-threatening invasive disease. Opportunistic Candida infections are becoming common in hospitals today, together with antifungal resistance, and Candida species account for almost 10–15% of nosocomial UTIs. Candida albicans has long been the commonest infecting species, but indiscriminate use of the azole group of drugs has led to an increase in non-albicans Candida infections and to antifungal resistance among Candida species. Aim: To determine the isolation pattern, species distribution and antifungal susceptibility pattern, particularly azole resistance, of Candida species in urine samples in the Department of Microbiology, Chengalpattu Medical College and Hospital. Materials and Methods: From August 2022 to July 2023, 71 Candida isolates were obtained from a total of 644 urine samples, speciated by various phenotypic methods, and subjected to antifungal susceptibility testing under standard mycological procedures. Results: A predominance of non-albicans Candida species was evident, with C. tropicalis (34%) the predominant species followed by C. albicans (23%). Fluconazole resistance was found in 37% of isolates, a comparatively high rate attributable to the prevalence of non-albicans Candida species. Conclusion: This study highlights the predominance of non-albicans Candida (NAC) species as an important cause of urinary tract infections. NAC species are more resistant to antifungal drugs than Candida albicans, owing to a higher prevalence of azole resistance. Species identification of Candida isolates, together with their antifungal susceptibility pattern, can therefore help clinicians better treat patients with candiduria.
Appropriate management with antifungal agents can significantly decrease the length of stay of hospitalized patients, who may already be immunocompromised, thereby reducing the chance of acquiring nosocomial infections and lowering healthcare costs. Keywords: Susceptibility, candidiasis, antifungal resistance, immunocompromised.
Page No: 467-472 | Full Text
Original Research Article
MORPHOMETRIC ANALYSIS AND CONGENITAL ANOMALIES OF KIDNEY AND URETER IN ADULT CADAVERS: A CROSS-SECTIONAL STUDY
http://dx.doi.org/10.70034/ijmedph.2026.2.80
Vanajakshi Bothsa, K Vijayalakshmi
View Abstract
Background: Measurement of morphometric parameters of the kidney and ureter plays a crucial role in clinical practice, as existing variations can influence clinical diagnosis and surgical procedures. Materials and Methods: A descriptive cross-sectional study was conducted in the Department of Anatomy at a tertiary medical institute over a period of five years, from February 2020 to March 2025. A total of 50 pairs of adult cadaveric kidneys and ureters were examined. Morphometric parameters including renal length, width, thickness, weight, and ureteric length were measured using vernier callipers and a digital weighing machine. Statistical analysis was performed using IBM SPSS version 27. Results: The left kidney consistently demonstrated greater length, width, thickness, and weight than the right kidney (p < 0.0001). No statistically significant difference was observed in ureteric length between sides (p = 0.949). Pearson’s correlation analysis showed a weak positive correlation between renal length and ureteric length on both sides, which was not statistically significant. Congenital anomalies were observed in 14% of cadavers. Conclusion: The study provides baseline morphometric data on kidneys and ureters in adult cadavers and highlights the presence of congenital anomalies that could have clinical significance in surgical and radiological practice. Keywords: Kidney morphometry, Cadaveric study, Congenital renal anomalies, Ureter morphometry.
Page No: 473-478 | Full Text
Case Report
ACUTE BASAL GANGLIA HEMORRHAGE AS THE INITIAL PRESENTATION OF EXTRA-ADRENAL PARAGANGLIOMA IN AN ADOLESCENT: A CASE REPORT
http://dx.doi.org/10.70034/ijmedph.2026.2.81
Vivek Kumar Tripathi, Mahim Mittal
View Abstract
Background: Intracerebral hemorrhage (ICH) in adolescents is uncommon and often indicates an underlying secondary etiology. Among these, catecholamine-secreting tumors such as paragangliomas are rare but potentially life-threatening causes of severe hypertension and vascular complications. Case Presentation: We report the case of a 15-year-old male who presented with sudden onset left-sided hemiparesis and slurred speech. On admission, he was found to have severe hypertension (200/110 mmHg). Neuroimaging revealed a right basal ganglia hemorrhage. Further evaluation uncovered a history of episodic headache, palpitations, sweating, and abdominal pain suggestive of catecholamine excess. Abdominal imaging demonstrated a well-defined enhancing mass in the aortocaval region adjacent to the right adrenal gland. Biochemical analysis showed significantly elevated urinary metanephrine levels, confirming the diagnosis of an extra-adrenal paraganglioma. Management and Outcome: The patient was managed conservatively for intracerebral hemorrhage with blood pressure control using intravenous labetalol and osmotherapy. Rehabilitation therapy was initiated, and the patient was subsequently referred for definitive surgical management of the tumor. Conclusion: This case highlights the importance of considering paraganglioma as a rare but critical cause of hypertensive intracerebral hemorrhage in adolescents. Early recognition through clinical suspicion, biochemical testing, and imaging is essential to prevent morbidity and mortality. Keywords: Intracerebral hemorrhage; basal ganglia; paraganglioma; adolescent hypertension; catecholamine excess; stroke in young.
Page No: 479-483 | Full Text
Original Research Article
A STUDY OF ASSOCIATION OF CLINICAL SEVERITY OF ACUTE ISCHEMIC STROKE WITH SERUM CALCIUM LEVELS AT THE TIME OF ADMISSION: A CROSS-SECTIONAL STUDY
http://dx.doi.org/10.70034/ijmedph.2026.2.82
Nadgatti Vinayak Umesh, Rahul Arya, Mukesh Kumar
View Abstract
Background: Acute ischemic stroke is a leading cause of mortality and disability worldwide. Dysregulation of calcium homeostasis is a key mechanism of ischemic injury. This study aimed to investigate whether serum calcium levels at the time of hospital admission correlate with the clinical severity of acute ischemic stroke as assessed by the National Institutes of Health Stroke Scale (NIHSS). Materials and Methods: A cross-sectional observational study was conducted at T.S. Misra Medical College & Hospital, Lucknow, enrolling 50 consecutive patients diagnosed with acute ischemic stroke on CT imaging and presenting within 24 hours of symptom onset. Serum total calcium, albumin-corrected calcium, and ionized calcium were measured at admission. Stroke severity was assessed using the NIHSS at admission, day 4, and discharge. Patients were classified as mild (NIHSS 1–4), moderate (NIHSS 5–15), or severe (NIHSS 16–42). Infarct size was categorized on CT imaging. Statistical correlation and group comparisons were performed across severity strata and discharge status. Results: The mean age was 61.66 ± 13.88 years, with a male predominance of 70%. Mean serum total calcium was 9.09 ± 0.90 mg/dL; corrected calcium was 9.28 ± 0.98 mg/dL; and ionized calcium was 1.15 ± 0.12 mmol/L. At admission, 68% had moderate strokes. A statistically significant inverse relationship was found between total serum calcium and stroke severity (p = 0.035), with mild stroke patients having the highest calcium (9.45 ± 0.61 mg/dL) and severely affected patients the lowest (8.80 ± 0.85 mg/dL). Ionized calcium also correlated significantly with stroke severity (p = 0.044). Patients who deteriorated clinically had significantly lower total calcium (7.84 ± 0.58 mg/dL) than those who improved (9.44 ± 0.64 mg/dL; p < 0.001). The mean NIHSS score improved from 7.16 at admission to 5.82 at discharge. No significant association was found between ionized calcium and infarct size (p = 0.565).
Conclusion: Lower admission serum calcium was significantly associated with greater stroke severity and poorer short-term outcomes. Serum total calcium may serve as a simple, cost-effective, and widely available prognostic biomarker in acute ischemic stroke. Larger prospective studies are needed. Keywords: Acute ischemic stroke, serum calcium, corrected calcium, NIHSS score, stroke severity, infarct size, prognostic biomarker.
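The methods above report albumin-corrected calcium alongside total and ionized calcium. A widely used correction formula is corrected Ca = total Ca + 0.8 × (4.0 − albumin), with calcium in mg/dL and albumin in g/dL; the abstract does not state which formula this study used, and the values below are illustrative, not patient data:

```python
def corrected_calcium(total_ca_mg_dl, albumin_g_dl):
    """Albumin-corrected calcium (mg/dL) using the common correction of
    0.8 mg/dL per 1 g/dL of albumin, normalized to an albumin of 4.0 g/dL.
    (Assumed formula for illustration; the study may have used another.)"""
    return total_ca_mg_dl + 0.8 * (4.0 - albumin_g_dl)

# Illustrative example: total Ca 8.5 mg/dL with albumin 2.5 g/dL
print(round(corrected_calcium(8.5, 2.5), 1))
```

Correction matters because a substantial fraction of serum calcium is albumin-bound, so hypoalbuminemia lowers total calcium without changing the physiologically active ionized fraction.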
Page No: 484-489 | Full Text
Original Research Article
CLINICAL CHARACTERISTICS AND OUTCOMES OF ROAD TRAFFIC INJURY PATIENTS AT A TERTIARY CARE HOSPITAL
http://dx.doi.org/10.70034/ijmedph.2026.2.83
Chirag Dineshbhai Amin, Dhruvesh Laljibhai Katara, Meet Pravinbhai Prajapati, Ashish Chaudhary, Bhargav Valangar
View Abstract
Background: Road traffic injuries are a major cause of morbidity and mortality worldwide, particularly affecting young adults. Objective: To assess the characteristics and outcomes of non-polytraumatized road traffic injury patients presenting to the emergency department. Materials and Methods: A prospective observational study was conducted on 150 patients with road traffic injuries. Demographic details, injury characteristics, and outcomes were recorded and analyzed. Results: Males constituted 68.0% of cases, with peak incidence in the 31–40 years age group (22.7%). Commercial vehicles were the most common mode of transportation involved. Head injuries (30.7%) and fractures (20.0%) were the most frequent injury types. Higher admission rates were observed in truck-related injuries (80.0%), while mortality was higher in motorcycle-related cases (12.0%). Conclusion: Road traffic injuries predominantly affect young adult males and vary in outcome depending on vehicle type and injury pattern. Early intervention and preventive strategies are essential to improve outcomes. Keywords: Road traffic injury, Emergency department, Trauma, Outcome analysis.
Page No: 490-493 | Full Text
Original Research Article
ASSESSMENT OF CLINICORADIOLOGICAL PATTERN AND ETIOLOGICAL FACTORS AMONG PATIENTS WITH BRONCHIECTASIS
http://dx.doi.org/10.70034/ijmedph.2026.16.2.84
Jatin Arya, Aashutosh Asati, P. K. Baghel, Pramod Kushwaha, Sudeept Kumar Dwivedi
View Abstract
Background: Bronchiectasis is a chronic suppurative lung disease characterized by irreversible bronchial dilatation, recurrent infections, and progressive respiratory impairment. High-resolution computed tomography (HRCT) has emerged as the gold standard for diagnosis, enabling detailed assessment of disease extent and patterns. Identifying clinicoradiological correlations and etiological factors is crucial for targeted management and prognostication. Objectives: To assess the clinical presentation, radiological patterns on HRCT chest, and etiological factors among patients with bronchiectasis and to evaluate their association with disease severity. Materials and Methods: This observational cross-sectional study was conducted over a period of one year at Shyam Shah Medical College and Sanjay Gandhi Memorial Hospital, Rewa, Madhya Pradesh, and included 100 diagnosed cases of bronchiectasis. Detailed clinical evaluation, sputum microbiology, pulmonary function testing, and HRCT chest were performed. Radiological patterns (cylindrical, varicose, cystic, mixed), lobar distribution, and associated findings were analyzed. Etiology was categorized as post-infective, tuberculosis-related, idiopathic, or congenital. Results: The mean age of patients was 60.05 ± 11.64 years, with a female predominance (55%). The most common symptoms were chronic cough (92%), expectoration (52%), and recurrent hemoptysis (27%). HRCT revealed varicose bronchiectasis as the predominant pattern (34%), followed by mixed (26%), cylindrical (20%), and cystic (20%) types. Varicose bronchiectasis was the most common HRCT pattern in both smokers and non-smokers. On the FACED score, 45% of patients had a score of 0–2, 28% had scores of 3–4, and 27% had scores of 5–7, the last representing severe disease with poorer outcomes. Post-tubercular bronchiectasis was the most common etiology (30%), followed by COPD-associated (28%), idiopathic (25%), and ABPA (17%).
Patients with cystic and mixed patterns showed significantly lower FEV1 values and a higher frequency of Pseudomonas aeruginosa isolation (p < 0.01). There was a significant negative correlation between oxygen saturation (SpO₂) and FACED score across all severity categories. Conclusion: Bronchiectasis in this cohort predominantly affected middle-aged and older patients and commonly presented with chronic productive cough. Varicose bronchiectasis was the most frequent HRCT pattern. Post-infective and post-tubercular etiologies remain leading causes. HRCT patterns correlate significantly with clinical severity and microbiological profile, emphasizing HRCT's role in comprehensive disease assessment and management. Keywords: Bronchiectasis; High-Resolution Computed Tomography; Clinicoradiological Correlation; Etiological Factors; Post-infective Lung Disease.
Page No: 494-500 | Full Text
Original Research Article
A STUDY ON CAN (CLINICAL ASSESSMENT OF NUTRITION) SCORE AND ITS CORRELATION WITH GESTATIONAL AGE
http://dx.doi.org/10.70034/ijmedph.2026.2.85
Suma Priya M, Sandupatla Kiran Kumar, Pogula Arun Reddy, B. Venkatachalam, Mettu Pradeep Reddy
View Abstract
Background: The primary objective of the study was to assess the nutritional status of newborn babies using the Clinical Assessment of Nutrition score (CAN SCORE) within 48 hours after birth and categorize them as nourished or malnourished. The secondary objective was to find the association of the CAN SCORE with neonatal clinical parameters. Materials and Methods: A comparative, hospital-based cross-sectional observational study was done on 200 newborn babies delivered in Mallareddy Medical College for Women and Hospital. The study was done for 1 year, from December 2020 to December 2021. The newborn parameters used to assess nutritional status were the CAN SCORE and anthropometric measurements. The analysis used the Statistical Package for the Social Sciences (SPSS), and a p-value of < 0.05 was considered significant. Results: The nutritional status of the newborns was assessed based on the CAN SCORE, which showed that 127 (63.5%) babies had fetal malnutrition, with CAN SCORE <25, and 73 (36.5%) were well nourished, with CAN SCORE >25. The association of the CAN SCORE with different parameters was then evaluated. 70% of late pre-term and 65% of term babies were malnourished, showing a statistically significant association. The CAN SCORE identified 63.5% of the newborns as malnourished, whereas weight for gestational age identified 38% as SGA and 62% as AGA, with a p-value of 0.017. The association between neonatal BMI and CAN SCORE was statistically significant, with a p-value of 0.016 (<0.05). The association between Ponderal Index and CAN SCORE was statistically significant, with a p-value of 0.008 (<0.05). Conclusion: The CAN SCORE has a statistically significant association with gestational age, birth weight according to gestational age, neonatal BMI, ponderal index, and maternal anaemia status. Keywords: Nutritional Score, Malnourished, Newborn babies.
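For reference, the Ponderal Index (Rohrer's index) used alongside the CAN SCORE is conventionally computed as follows (the formula is standard in neonatal assessment, though the abstract does not restate it):

```latex
\text{Ponderal Index} = \frac{\text{birth weight (g)} \times 100}{\text{crown–heel length (cm)}^{3}}
```

For example, a term newborn weighing 3000 g with a length of 50 cm would have a Ponderal Index of 3000 × 100 / 125000 = 2.4.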
Page No: 501-506 | Full Text
Original Research Article
IMMUNOHISTOCHEMICAL EVALUATION OF P63 EXPRESSION IN THE GRADING OF UROTHELIAL NEOPLASMS
http://dx.doi.org/10.70034/ijmedph.2026.2.86
Suganthi P, Dhivya M, J. Maheswari
View Abstract
Background: Urothelial carcinoma (UC) is the most common malignancy of the urinary bladder and demonstrates wide morphological and biological variability. Although tumor grading is essential for prognosis and treatment decisions, interobserver variability remains a problem. It has been suggested that p63, a nuclear transcription factor involved in epithelial development, might be used as an additional marker to grade urothelial neoplasms. The aim is to evaluate the immunohistochemical expression of p63 in varying grades of urothelial neoplasms and to assess its correlation with histopathological grade and clinicopathological parameters. Materials and Methods: This laboratory-based prospective and retrospective observational study included 50 cases of primary urothelial carcinoma diagnosed between July 2015 and June 2019 at the Department of Pathology, Dhanalakshmi Srinivasan Medical College and Hospital. Formalin-fixed, paraffin-embedded tissue sections were subjected to immunohistochemical staining using a mouse monoclonal antibody against p63 (clone 4A4). Nuclear staining in more than 10% of tumor cells was considered increased expression, while staining in less than 10% was considered decreased expression. Statistical analysis was performed using SPSS version 21.0, and the Chi-square test was applied with p < 0.05 considered significant. Results: High-grade urothelial carcinoma constituted 62% of cases, while 38% were low grade. Increased p63 expression was predominantly observed in low-grade tumors (64.3%), whereas decreased expression was mainly associated with high-grade tumors (95.5%). A statistically significant inverse correlation was found between p63 expression and tumor grade (χ² = 18.662, p < 0.001). No significant association was observed between p63 expression and age or tumor stage. Conclusion: p63 expression shows a significant inverse correlation with the histological grade of urothelial carcinoma.
Decreased expression is associated with high-grade tumors, suggesting its utility as an adjunct immunohistochemical marker in grading urothelial neoplasms. Larger studies with long-term follow-up are recommended to further establish its prognostic significance. Keywords: Urothelial carcinoma; p63; Immunohistochemistry; Histological grading; Bladder cancer; Tumor differentiation; WHO/ISUP classification; Muscle invasion; Prognostic marker.
Page No: 507-513 | Full Text
Original Research Article
COMPARISON OF INSERTION CHARACTERISTICS OF LMA PROSEAL AND AMBU AURAGAIN IN ADULT PATIENTS UNDER CONTROLLED VENTILATION: A RANDOMISED STUDY
http://dx.doi.org/10.70034/ijmedph.2026.2.87
Jairam Panicker, Mathews. P. Oommen
View Abstract
Background: Supraglottic airway devices are widely used for airway management in patients undergoing surgery under general anaesthesia. The LMA ProSeal and Ambu AuraGain are second-generation supraglottic airway devices designed to provide improved airway seal and facilitate gastric access. Although both devices are commonly used in clinical practice, their insertion characteristics and performance under controlled ventilation require comparative evaluation. The objective is to compare the insertion characteristics of LMA ProSeal and Ambu AuraGain in adult patients undergoing surgery under general anaesthesia with controlled ventilation, with respect to ease of insertion, number of attempts, time taken for successful insertion, oropharyngeal leak pressure, and perioperative complications. Materials and Methods: This prospective, randomized study was conducted over a period of 12 months in a tertiary care hospital. A total of 100 adult patients, aged 18–60 years and belonging to ASA physical status I and II, scheduled for elective surgeries under general anaesthesia with controlled ventilation were enrolled. Patients were randomly allocated into two groups of 50 each: Group P (LMA ProSeal) and Group A (Ambu AuraGain). The primary outcomes assessed were number of insertion attempts, time taken for successful device insertion, and ease of insertion. Secondary outcomes included oropharyngeal leak pressure, adequacy of ventilation, and incidence of perioperative complications such as sore throat, blood staining of the device, and airway trauma. Results: Both devices were successfully inserted in the majority of patients. The Ambu AuraGain demonstrated a higher first-attempt success rate and shorter insertion time compared to the LMA ProSeal. Oropharyngeal leak pressure was comparable between the two groups. The incidence of minor complications was low and did not differ significantly between the groups. 
Conclusion: Both LMA ProSeal and Ambu AuraGain are effective and safe supraglottic airway devices for use in adult patients under controlled ventilation. However, Ambu AuraGain appears to offer advantages in terms of ease and speed of insertion, with comparable airway seal and complication profile. Keywords: LMA ProSeal, Ambu AuraGain, supraglottic airway device, controlled ventilation, randomized study, airway management.
Page No: 514-519 | Full Text
Original Research Article
COMPARISON OF MEDIAN AND PARAMEDIAN TECHNIQUE OF THORACIC EPIDURAL ANAESTHESIA IN PATIENTS UNDERGOING LAPAROTOMY UNDER COMBINED GENERAL AND EPIDURAL ANAESTHESIA: A PROSPECTIVE OBSERVATIONAL STUDY
http://dx.doi.org/10.70034/ijmedph.2026.2.88
Mathews. P. Oommen, Jairam Panicker
View Abstract
Background: Thoracic epidural anaesthesia, when combined with general anaesthesia, is widely used for perioperative analgesia in patients undergoing laparotomy. The two commonly employed approaches for epidural space identification are the median and paramedian techniques. Although both techniques are routinely practiced, they differ in anatomical approach, technical ease, and potential complication profile. Comparative data on their performance in the thoracic region, particularly in the context of combined general and epidural anaesthesia, remain limited. The objective is to compare the median and paramedian techniques of thoracic epidural catheter placement in patients undergoing laparotomy with respect to technical success, number of attempts, ease of catheter placement, procedure-related complications, and quality of perioperative analgesia. Materials and Methods: This prospective observational study included 120 adult patients scheduled for elective laparotomy under combined general and thoracic epidural anaesthesia. Patients underwent thoracic epidural catheter placement using either the median or paramedian approach based on the attending anaesthesiologist’s routine practice. Data collected included demographic variables, number of attempts required for successful epidural placement, time taken for the procedure, incidence of complications such as vascular puncture, dural puncture, and paraesthesia, and intraoperative as well as postoperative analgesic efficacy. Postoperative pain scores and requirement of rescue analgesia were recorded to assess the quality of analgesia. The two groups were compared using appropriate statistical tests. Results: Both techniques were found to be effective for thoracic epidural placement. The paramedian approach was associated with a higher first-attempt success rate and fewer needle redirections, whereas the median approach required fewer anatomical landmarks to be negotiated. 
Procedure-related complications were infrequent in both groups. Postoperative analgesia, as assessed by pain scores and rescue analgesic requirements, was comparable between the two techniques. Conclusion: Both median and paramedian approaches for thoracic epidural anaesthesia are safe and effective in patients undergoing laparotomy under combined general and epidural anaesthesia. The paramedian approach may offer technical advantages in terms of ease of placement and first-attempt success, while analgesic outcomes remain comparable between the two techniques. Keywords: Thoracic epidural anaesthesia, Median approach, Paramedian approach, Laparotomy, Combined general and epidural anaesthesia, Postoperative analgesia.
Page No: 520-526 | Full Text
Original Research Article
COMPARISON OF DEXAMETHASONE AND KETOROLAC AS AN ADJUNCT TO 0.5% (H) LEVOBUPIVACAINE IN ULTRASOUND AND PNS GUIDED AXILLARY BRACHIAL PLEXUS BLOCK IN PATIENTS UNDERGOING HAND AND FOREARM SURGERY
http://dx.doi.org/10.70034/ijmedph.2026.2.89
Shraddha More, Priya Ragini Prasad, Charu Neema
View Abstract
Background: Regional anesthesia has become an essential component of modern anesthetic practice, particularly in upper limb surgeries, where it provides excellent analgesia, muscle relaxation, reduced opioid consumption, and early mobilization with fewer adverse effects compared to general anesthesia. Among the various regional anesthesia techniques, the axillary brachial plexus block is widely used for hand and forearm surgeries due to its safety, effectiveness, and ease of administration. The objective is to compare dexamethasone and ketorolac as adjuncts to 0.5% hyperbaric levobupivacaine in ultrasound- and peripheral nerve stimulator-guided axillary brachial plexus block in patients undergoing hand and forearm surgery. Materials and Methods: This prospective randomized comparative study was conducted in patients undergoing hand and forearm surgeries under axillary brachial plexus block. Patients were divided into two groups receiving levobupivacaine with dexamethasone or levobupivacaine with ketorolac. Sensory block, motor block, duration of analgesia, hemodynamic parameters, and rescue analgesia were recorded and analyzed statistically. Results: The onset of sensory and motor block was faster in the dexamethasone group compared to the ketorolac group. The duration of sensory and motor block was significantly prolonged in the dexamethasone group. Duration of postoperative analgesia was longer in patients who received dexamethasone as an adjuvant compared to ketorolac. Hemodynamic parameters remained stable in both groups throughout the study period. The number of rescue analgesic doses required was lower in the dexamethasone group compared to the ketorolac group. No significant adverse effects were observed in either group. Overall, dexamethasone provided better block characteristics and longer postoperative analgesia compared to ketorolac.
Conclusion: Dexamethasone and ketorolac are effective adjuvants when added to levobupivacaine for axillary brachial plexus block. However, dexamethasone provides faster onset of block, longer duration of sensory and motor block, and prolonged postoperative analgesia compared to ketorolac. The addition of dexamethasone to levobupivacaine improves the quality of block and reduces postoperative analgesic requirement. Keywords: Axillary Brachial Plexus Block, Levobupivacaine, Dexamethasone, Ketorolac
Page No: 527-534 | Full Text
Original Research Article
ASSESSMENT OF COMPLIANCE WITH THE CIGARETTES AND OTHER TOBACCO PRODUCTS ACT (COTPA), 2003 IN PUBLIC PLACES OF SOUTH ANDAMAN DISTRICT, INDIA
http://dx.doi.org/10.70034/ijmedph.2026.2.90
Amrita Burma, N Haney Nickyson, Deepak Kumar, Anagha J
View Abstract
Background: The Cigarettes and Other Tobacco Products Act (COTPA), 2003 is a key legislative tool in India to curb tobacco use. However, its enforcement remains inconsistent. This study assessed the compliance of COTPA sections 4, 5, 6, 7, and 8 in public places and evaluated awareness among tobacco vendors in South Andaman district. Materials and Methods: A cross-sectional observational study was conducted over two months (September–October). Using two-stage cluster sampling, 106 public places (accommodation facilities, educational institutions, health facilities, public transport, shops) across three regions (North, Central, South) of South Andaman were observed. Additionally, 30 tobacco vendors were interviewed using a structured questionnaire. Data were analyzed using descriptive statistics and chi-square tests. Results: Only 23.3% of vendors were aware of COTPA. Overall compliance was poor: smoking in public places (Section 4) was observed in 41.5% of sites; 93.4% lacked mandatory signboards; 43.4% sold tobacco to minors (Section 6a); 27.4% sold within 100 yards of educational institutions (Section 6b). While 89.6% of tobacco packets displayed pictorial health warnings (Section 7), 93.4% sold loose cigarettes, violating the ban. Compliance with specified font and color for warnings (Section 8) was 83%. Regional variations existed, with poorer compliance in rural southern and northern areas for sales to minors, and urban central areas for advertising and packaging violations. Conclusion: COTPA compliance in South Andaman is inadequate, reflecting poor awareness and weak enforcement. Strengthened inter-sectoral coordination, regular monitoring, stricter penalties, and targeted public awareness campaigns are urgently needed. Keywords: COTPA, tobacco control, compliance, public places, Andaman and Nicobar Islands, tobacco vendors, enforcement.
Page No: 535-542 | Full Text
Original Research Article
A STUDY OF ROLE OF PLATELET TO LYMPHOCYTE RATIO [PLR] AND ITS CORRELATION WITH NATIONAL INSTITUTE OF HEALTH STROKE SCALE [NIHSS] FOR PREDICTION OF SEVERITY IN PATIENTS OF ACUTE ISCHEMIC STROKE
http://dx.doi.org/10.70034/ijmedph.2026.2.91
Srinivasa J, Umesh G Rajoor, Mohd Naveed Khan
View Abstract
Background: Despite advances in clinical management, there is no robust prognostic marker in acute stroke. The platelet-to-lymphocyte ratio (PLR) is considered an important marker for predicting stroke severity and outcomes. This study aims to evaluate the correlation between the PLR and the National Institutes of Health Stroke Scale (NIHSS) for predicting stroke severity and prognosis. Materials and Methods: This prospective observational study was conducted at Koppal Institute of Medical Sciences, Koppal, over one year (May 2022 to April 2023). Fifty patients with acute ischemic stroke, presenting within seven days of symptom onset, were enrolled. Detailed clinical and laboratory data, including platelet and lymphocyte counts, were collected at admission and again at day 3 or at the time of discharge. Stroke severity was assessed using the NIHSS at both time points. Statistical analysis was performed using SPSS version 21, with Pearson's correlation coefficient used to analyze the relationship between PLR and NIHSS. Results: The study found a significant positive correlation between PLR and NIHSS at both admission (r = 0.874, p < 0.001) and day 3/discharge (r = 0.907, p < 0.001). Changes in PLR from admission to discharge were also strongly correlated with changes in NIHSS (r = 0.938, p < 0.001). These findings suggest that higher PLR values are associated with greater stroke severity and that changes in PLR reflect changes in stroke severity. Conclusion: PLR is a simple inflammatory marker for predicting stroke severity and aids in the prognosis of patients with acute ischemic stroke. The significant correlations of PLR with NIHSS at both admission and day 3/discharge underscore the potential of PLR to enhance clinical assessment and guide therapeutic decisions. Further research is required to validate these findings in larger, multi-centre cohorts and to elucidate the mechanisms underlying the relationship between PLR and stroke outcomes.
Keywords: Acute ischemic stroke, Platelet-to-Lymphocyte Ratio, National Institutes of Health Stroke Scale, Stroke severity.
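For readers reproducing this type of analysis, the PLR is simply the absolute platelet count divided by the absolute lymphocyte count, and the reported association uses Pearson's correlation coefficient. A minimal self-contained sketch follows; the patient values are hypothetical illustrations, not data from this study:

```python
import math

def plr(platelets_per_ul: float, lymphocytes_per_ul: float) -> float:
    """Platelet-to-lymphocyte ratio from absolute counts (cells/uL)."""
    return platelets_per_ul / lymphocytes_per_ul

def pearson_r(xs, ys):
    """Pearson's correlation coefficient between two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical admission values for five patients (illustrative only):
plr_values = [plr(p, l) for p, l in [(250_000, 2_500), (300_000, 1_500),
                                     (350_000, 1_200), (280_000, 1_000),
                                     (400_000, 900)]]
nihss_scores = [4, 9, 14, 17, 22]
r = pearson_r(plr_values, nihss_scores)  # strongly positive for these values
```

A higher r indicates that patients with higher PLR tended to have higher NIHSS scores, which is the direction of association the study reports.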
Page No: 543-547 | Full Text
Original Research Article
PATTERN OF LYMPH NODE LESIONS CLASSIFIED BY THE SYDNEY SYSTEM: A TERTIARY CARE CENTER EXPERIENCE IN SOUTHERN TELANGANA
http://dx.doi.org/10.70034/ijmedph.2026.2.92
Pavuluri Divya, Siva Chaithanya Bangi, Anusha Priyadarshani, Rama Devi
View Abstract
Background: Fine needle aspiration cytology (FNAC) is a rapid, minimally invasive, and cost-effective diagnostic modality for the evaluation of lymphadenopathy. Historically, the absence of a standardized reporting system resulted in variability in interpretation and communication. The Sydney Reporting System was introduced at the 20th International Congress of Cytology, held in Sydney in 2019, to provide a uniform framework for lymph node FNAC reporting. Materials and Methods: This prospective study was conducted in the Department of Pathology, Government Medical College, Wanaparthy, from 2023 to 2025. All lymph node FNAC cases received during the study period were reviewed and categorized according to the Sydney Reporting System. Demographic details, lymph node site, and cytological diagnosis were analyzed. Results: A total of 90 lymph node FNAC cases were evaluated. Benign lesions constituted the majority, with reactive lymphadenitis being the most common diagnosis, followed by granulomatous and suppurative lymphadenitis. Malignant lesions formed a smaller proportion of cases. Cervical lymph nodes were the most frequently involved site. Conclusion: The Sydney Reporting System is a practical and reproducible framework for standardized lymph node FNAC reporting. Its application improves diagnostic clarity and facilitates effective clinician–pathologist communication, even in FNAC-only settings. Keywords: FNAC, Lymph node, Sydney system, Cytology.
Page No: 548-550 | Full Text
Systematic Review
CLINICAL PROFILE AND MANAGEMENT OUTCOMES OF EUGLYCEMIC DIABETIC KETOACIDOSIS: A SYSTEMATIC REVIEW
http://dx.doi.org/10.70034/ijmedph.2026.2.93
Habiba Mohammed Usman Dudhmal
View Abstract
Background: Euglycemic diabetic ketoacidosis (EDKA) is a variant of diabetic ketoacidosis marked by metabolic acidosis and ketosis despite normal or only mildly elevated blood glucose levels. In contrast to classical diabetic ketoacidosis (DKA), the lack of significant hyperglycemia can obscure recognition and delay timely management. The growing use of sodium–glucose cotransporter-2 (SGLT2) inhibitors, together with triggers such as infection, prolonged fasting, pregnancy, and missed insulin doses, has led to an increasing number of reported cases. Therefore, a systematic review is needed to integrate current evidence and strengthen clinical recognition and understanding of this emerging condition. The objective is to comprehensively evaluate the demographic and clinical characteristics, precipitating factors, management strategies, and outcomes (including recovery, complications, intensive care requirement, and mortality) of patients diagnosed with euglycemic diabetic ketoacidosis. Materials and Methods: This study was conducted as a systematic review in accordance with PRISMA guidelines. A comprehensive literature search was performed in PubMed/MEDLINE, Scopus, Web of Science, Embase, and the Cochrane Library using relevant keywords related to euglycemic diabetic ketoacidosis. Observational studies, case series, randomized controlled trials, case reports, review articles, and meta-analyses were included. Titles, abstracts, and full-text articles were independently screened according to predefined eligibility criteria. Results: Euglycemic diabetic ketoacidosis affects all age groups and is increasingly linked to SGLT2 inhibitor use. It presents with milder acidosis and lower glucose levels, which may delay diagnosis. Triggers include reduced intake, infection, pregnancy, insulin omission, surgical stress, low-carbohydrate diets, and SGLT2 inhibitors.
Outcomes are comparable to classical DKA with appropriate treatment, though hypoglycemia during therapy is more frequent. Early recognition and prompt management are essential to reduce complications. Management included fluid resuscitation, electrolyte correction, insulin infusion, and dextrose supplementation. Euglycemic DKA required shorter insulin duration but had higher hypoglycemia risk. Mortality was under 5% with standardized treatment. Conclusion: Euglycemic DKA is a serious diabetic emergency that may be overlooked due to near-normal glucose levels and is increasingly associated with SGLT2 inhibitor use, particularly in perioperative settings. Early recognition, patient education, routine ketone monitoring, and preventive strategies such as the STICH and STOP DKA protocols are essential to reduce risk. Prompt management and structured follow-up remain key to preventing recurrence and improving outcomes. Keywords: Euglycemic diabetic ketoacidosis, SGLT2 inhibitors, Diabetic ketoacidosis, Type 1 diabetes mellitus, Insulin therapy, Clinical profile, Management outcomes.
Page No: 551-556 | Full Text
Original Research Article
A POSTMORTEM CHRONICLE OF HANGING DEATHS IN A DECADAL SOUTHERN INDIAN STUDY (2011-2020)
http://dx.doi.org/10.70034/ijmedph.2026.2.94
Shodhan Rao Pejavar, Prashantha Bhagavath, Tanush Shetty, Rashmi R Aithal, Rishab Shetty, Narasimha Pai D, Arun Pinchu Xavier, Francis Nanda Prakash Monteiro
View Abstract
Hanging is one of the most common methods of suicide worldwide, accounting for around 55.6% of the estimated 700,000 suicide deaths that occur globally each year. The incidence is particularly high in Asia, where hanging is responsible for nearly 60% of suicides, influenced by sociocultural, economic, and legal factors. This retrospective descriptive study aims to analyse the demographic, circumstantial, and forensic characteristics of hanging deaths autopsied at a tertiary medical institution in southern India over ten years (2011–2020). A total of 169 confirmed cases of hanging deaths were examined using archived post-mortem reports, police inquest documents, and victim profiles. Data on demographics, scene findings, external and internal autopsy findings, and inferred reasons for suicide were collected and analysed descriptively using SPSS. Victims ranged in age from 11 to 63 years, predominantly those aged 20–49 years (74.56%). Males accounted for 63.91% of cases. Most hangings occurred at home (86.98%) and involved complete suspension (75.74%). Rope was the most common ligature material (66.86%). External examination showed the ligature mark characteristics typical of hanging, including atypical knot positioning (85.21%), running knots (73.96%), and oblique ligature marks (95.86%). Internal examination revealed sternocleidomastoid muscle haemorrhages in 78.70% of cases and hyoid bone fractures in 17.16%. The leading inferred reasons for suicide were relationship or marital conflicts (43.20%), mental illness (13.02%), and academic or professional pressure (12.43%). This study provides comprehensive forensic and demographic insights into hanging deaths in southern India, highlighting the predominance of young adults and males, the home setting, and relationship conflicts as major contributing factors. The findings underscore the need for culturally sensitive suicide prevention strategies and improved mental health services in the region.
Keywords: Hanging, Ligature Mark, Mental Illness, Suicide.
Page No: 557-564 | Full Text
Original Research Article
IMPACT OF DISEASE DURATION, EXTENT, AND INFLAMMATORY SEVERITY ON COLORECTAL CANCER RISK IN INFLAMMATORY BOWEL DISEASE: A SYSTEMATIC REVIEW
http://dx.doi.org/10.70034/ijmedph.2026.2.95
Rani Vijayan
View Abstract
Background: Inflammatory Bowel Disease (IBD), including Ulcerative Colitis and Crohn’s Disease, is associated with an increased risk of Colorectal Cancer. The risk is believed to be influenced by several disease-related factors, particularly duration of illness, anatomical extent of colonic involvement, and severity of chronic inflammation. While multiple observational studies have explored these associations, the magnitude and consistency of these risk factors remain variably reported. A comprehensive systematic review is therefore warranted to synthesize current evidence and clarify the relative contribution of these factors to colorectal cancer risk in IBD patients. Objectives: The objective of this systematic review is to evaluate the impact of disease duration, extent of colonic involvement, and inflammatory severity on the risk of colorectal cancer in patients with inflammatory bowel disease (IBD), and to identify high-risk patient subgroups while synthesizing available evidence to inform surveillance strategies and risk stratification. Materials and Methods: This systematic review was conducted in accordance with the PRISMA guidelines. A comprehensive literature search was performed in electronic databases including PubMed, Scopus, and Web of Science. Observational studies (cohort and case–control studies), randomized controlled trials, review articles, and meta-analyses evaluating the association between disease duration, disease extent, inflammatory severity, and the risk of colorectal cancer in patients with inflammatory bowel disease were included. Titles, abstracts, and full texts were screened according to predefined eligibility criteria. Data extraction included the impact of disease duration, disease extent, and measures of inflammatory burden. The methodological quality of the included studies was assessed using appropriate risk-of-bias tools, and a qualitative synthesis was performed.
Results: Longer disease duration, extensive colitis, and persistent or severe inflammation were associated with an increased risk of colorectal cancer (CRC) in patients with inflammatory bowel disease. The risk increases approximately 8–10 years after diagnosis and rises with longer disease duration, particularly in pancolitis. Additional risk factors include primary sclerosing cholangitis, early disease onset, and family history of colorectal cancer, while surveillance colonoscopy improves dysplasia detection. Conclusion: The risk of colorectal cancer (CRC) in ulcerative colitis increases with longer disease duration, extensive colitis, severe inflammation, dysplasia, and primary sclerosing cholangitis. Surveillance colonoscopy, including advanced techniques such as chromoendoscopy, improves early detection and may reduce CRC risk. Further research is needed to better understand disease mechanisms and develop improved biomarkers and preventive strategies. Keywords: Inflammatory Bowel Disease (IBD), Ulcerative Colitis, Crohn’s Disease, Colorectal Neoplasia, Chronic Inflammation, Disease Duration, Pancolitis, Surveillance Colonoscopy.
Page No: 565-571 | Full Text
Original Research Article
PREDICTORS OF OUTCOME OF NON INVASIVE VENTILATION IN ACUTE EXACERBATION OF CHRONIC OBSTRUCTIVE PULMONARY DISEASE: A PROSPECTIVE COHORT STUDY
http://dx.doi.org/10.70034/ijmedph.2026.2.96
Nini Navakumar, Jayaprakash B, Rajan D
View Abstract
Background: Non-invasive ventilation (NIV) is a proven modality for managing acute exacerbations of chronic obstructive pulmonary disease (COPD) with respiratory failure. It reduces morbidity, hospital stay, and mortality. However, failure of NIV and delayed intubation can worsen outcomes. The objective is to identify clinical and biochemical predictors of NIV outcome in patients with acute exacerbation of COPD (AECOPD) admitted with respiratory failure. Materials and Methods: This prospective cohort study was conducted in the Intensive Respiratory Care Unit, Department of Respiratory Medicine, Government Medical College, Trivandrum, from January 2017 to July 2018. Eighty-five AECOPD patients with pH 7.2-7.35 and/or PaCO₂ ≥45 mmHg treated with NIV were included. ABG parameters were analyzed at 0, 2, 4, 24, and 48 hours. Patients showing clinical and ABG improvement at 48 hours were categorized as NIV successes; those requiring intubation or who died were classified as failures. Results: NIV success was seen in 65 patients (76.4%). Poor Glasgow Coma Scale score (GCS <8), serum bilirubin >1.2 mg/dL at admission, pH ≤7.3, and PaCO₂ ≥75 mmHg at 4 hours were independent predictors of NIV failure. Multivariate logistic regression identified pH ≤7.3 at 4 hours (OR 12.67; 95% CI 2.6-60.6; p=0.001), serum bilirubin >1.2 mg/dL (OR 8.3; 95% CI 1.2-54.7; p=0.02), GCS <8 (OR 4.46; 95% CI 1.0-19.5; p=0.04), and PaCO₂ ≥75 mmHg at 4 hours (OR 4.14; 95% CI 1.06-16.21; p=0.04) as significant predictors. Conclusion: Early reassessment of ABG after 4 hours of NIV initiation is essential. Persistently low pH, high PaCO₂, poor GCS, and elevated bilirubin levels predict NIV failure and can guide early intubation decisions. Keywords: ABG parameters, Bilirubin, pH, Hypercapnia, GCS, NIV failure predictors, Respiratory failure, Acute exacerbation, COPD.
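As a reading aid for the multivariate results above, the reported odds ratios are exponentiated logistic regression coefficients; for a predictor with coefficient β and standard error SE, the standard relations are:

```latex
\mathrm{OR} = e^{\beta}, \qquad 95\%\ \mathrm{CI} = e^{\beta \pm 1.96\,\mathrm{SE}}
```

For example, the reported OR of 12.67 for pH ≤7.3 at 4 hours corresponds to a model coefficient of β = ln(12.67) ≈ 2.54.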
Page No: 572-578 | Full Text
Original Research Article
EFFECT OF INTRAOPERATIVE LOCAL INFILTRATION ANESTHESIA ON POSTOPERATIVE PAIN FOLLOWING TOTAL KNEE REPLACEMENT: A COMPARATIVE STUDY
http://dx.doi.org/10.70034/ijmedph.2026.2.97
Y. Hari Prasad Reddy, P. Devanatha Reddy
View Abstract
Background: Total knee replacement is a commonly performed procedure for end-stage knee osteoarthritis, providing significant pain relief and functional improvement. However, postoperative pain remains a major concern, often affecting early mobilization, rehabilitation, and overall patient satisfaction. Effective pain control strategies are therefore essential. Intraoperative local infiltration anesthesia has emerged as a simple and effective technique aimed at reducing postoperative pain and opioid requirements. The objective is to evaluate the effect of intraoperative local infiltration anesthesia on postoperative pain and functional recovery in patients undergoing total knee replacement. Materials and Methods: This comparative study included patients undergoing total knee replacement, divided into two groups based on the use of intraoperative local infiltration anesthesia. One group received local infiltration anesthesia during surgery, while the control group underwent standard surgical procedure without infiltration. Postoperative pain was assessed using a standardized pain scoring system at predefined intervals. Secondary outcomes included time to mobilization, duration of hospital stay, and requirement of rescue analgesics. Patients were followed up during the immediate postoperative period for assessment of outcomes. Results: Patients who received intraoperative local infiltration anesthesia demonstrated significantly lower postoperative pain scores in the early postoperative period compared to the control group. The requirement for rescue analgesics was reduced in the infiltration group. Early mobilization was achieved more effectively, and patients in this group showed improved comfort during physiotherapy sessions. Duration of hospital stay was also shorter in patients receiving local infiltration anesthesia. No significant increase in complications was observed. 
Conclusion: Intraoperative local infiltration anesthesia is an effective and safe technique for reducing postoperative pain following total knee replacement. It facilitates early mobilization, decreases analgesic requirements, and improves overall patient recovery. Incorporation of this technique into standard perioperative pain management protocols can enhance postoperative outcomes. Keywords: Total knee replacement, Local infiltration anesthesia, Postoperative pain, Analgesia, Early mobilization, Orthopedic surgery
Page No: 579-584 | Full Text
Original Research Article
COMPARATIVE STUDY BETWEEN INTERLOCKING NAILING AND DYNAMIC COMPRESSION PLATING IN HUMERAL DIAPHYSEAL FRACTURES WITH RESPECT TO FUNCTIONAL AND SURGICAL OUTCOMES
http://dx.doi.org/10.70034/ijmedph.2026.2.98
P. Devanatha Reddy, Y. Hari Prasad Reddy
View Abstract
Background: Fractures of the humeral diaphysis are commonly encountered in orthopedic practice and can be managed by both operative and non-operative methods. Among surgical options, dynamic compression plating and interlocking intramedullary nailing are widely used techniques. While plating provides stable fixation with direct fracture visualization, interlocking nailing offers a minimally invasive approach with preservation of soft tissue biology. However, there remains ongoing debate regarding the superiority of either technique in terms of functional recovery, union rates, and complication profile. The objective is to compare the functional and surgical outcomes of interlocking nailing and dynamic compression plating in the management of humeral diaphyseal fractures. Materials and Methods: This comparative study included patients with humeral shaft fractures managed surgically using either interlocking intramedullary nailing or dynamic compression plating. Patients were divided into two groups based on the surgical technique employed. Functional outcomes were assessed using standard scoring systems, while surgical outcomes were evaluated based on duration of surgery, intraoperative blood loss, time to union, and complications. Patients were followed up at regular intervals for clinical and radiological assessment. Results: Both treatment modalities achieved satisfactory fracture union. Interlocking nailing demonstrated advantages in terms of shorter operative time and reduced intraoperative blood loss. However, dynamic compression plating showed superior functional outcomes, particularly with respect to shoulder function and range of motion. Time to union was comparable between the two groups, although a slightly earlier union trend was observed in the plating group. Complication rates varied between the groups, with shoulder stiffness being more common in the nailing group and infection-related complications observed more frequently in the plating group. 
Conclusion: Both interlocking nailing and dynamic compression plating are effective methods for the management of humeral diaphyseal fractures. While interlocking nailing offers advantages in surgical parameters, dynamic compression plating provides better functional outcomes, especially in terms of shoulder mobility. The choice of procedure should be individualized based on fracture characteristics, patient factors, and surgeon expertise. Keywords: Humerus shaft fracture, Interlocking nailing, Dynamic compression plating, Functional outcome, Surgical outcome, Fracture union, Orthopedic trauma.
Page No: 585-590 | Full Text
Original Research Article
MRI PROTOCOL FOR EPILEPSY IMAGING IN THE DEVELOPING WORLD: A PRACTICAL PROPOSAL
http://dx.doi.org/10.70034/ijmedph.2026.2.99
Nilesh Kumar Sinha, Gaurav Raj, Rudra Prasad Ghosh, Shubhlaxmi Srivastava
View Abstract
Background: Epilepsy disproportionately affects low- and middle-income countries (LMICs), where infectious etiologies such as neurocysticercosis and tuberculomas are common. Standard MRI protocols are primarily optimized for non-infectious causes and may reduce diagnostic yield in these settings. Purpose: To evaluate a modified MRI epilepsy protocol incorporating post-contrast T1-weighted imaging and susceptibility-weighted imaging (SWI) for improved lesion detection in LMICs. Materials and Methods: In this retrospective study, 102 patients with epilepsy underwent 3T MRI using a protocol including 3D T1, T2, FLAIR, DWI, SWI, and post-contrast T1 sequences. Lesion types and sequence-wise detection were analyzed, and stepwise rank analysis assessed cumulative diagnostic yield. Results: Inflammatory granulomas were the most common lesions (39.2%). Post-contrast T1-weighted imaging showed the highest standalone detection (45.1%), particularly for infectious and neoplastic lesions. FLAIR and T2 sequences were essential for gliosis, demyelination, and mesial temporal sclerosis, while SWI was critical for detecting calcified granulomas and cavernomas. Combined use of T1+C and FLAIR detected 73.5% of lesions, increasing to 100% with additional sequences. Conclusion: Inclusion of post-contrast T1 and SWI significantly improves detection of epileptogenic lesions in LMICs. A region-specific MRI protocol enhances diagnostic yield and may improve epilepsy management in resource-limited settings. Keywords: MRI, epilepsy, granuloma, neurocysticercosis, tuberculoma, epilepsy surgery, epilepsy protocol.
Page No: 591-594 | Full Text
Original Research Article
CLINICOPATHOLOGICAL STUDY OF SALIVARY GLAND LESIONS WITH REFERENCE TO HISTOLOGICAL TYPES, AGE, SEX AND SITE DISTRIBUTION
http://dx.doi.org/10.70034/ijmedph.2026.2.100
Urmi Chakravarty Vartak, Shital Mahure, Shailesh Vartak, Shailaja Puri
View Abstract
Background: Salivary gland lesions comprise a diverse group of pathological conditions ranging from non-neoplastic inflammatory processes to benign and malignant neoplasms. Due to their varied histological patterns and clinical presentations, accurate diagnosis remains a challenge. The aim is to study the clinicopathological spectrum of salivary gland lesions with reference to histological types, age, sex, and site distribution. Materials and Methods: This hospital-based observational study included 79 cases of salivary gland lesions diagnosed over a defined study period at a tertiary care center. Clinical details such as age, sex, and site were recorded. Specimens were processed using standard histopathological techniques, and lesions were classified according to WHO guidelines. Statistical analysis was performed using appropriate tests, with p < 0.05 considered significant. Results: The mean age of patients was 38.9 ± 16.4 years, with the highest incidence in the 41–60 years age group (43.0%). There was no significant gender predilection, with males (50.6%) and females (49.4%) being almost equally affected. The parotid gland (62.0%) was the most commonly involved site. Benign lesions constituted the majority (49.4%), followed by non-neoplastic (39.2%) and malignant lesions (11.4%). Pleomorphic adenoma (44.3%) was the most common benign tumor, while mucoepidermoid carcinoma (10.1%) was the most frequent malignant lesion. Non-neoplastic lesions such as sialadenitis and abscess were also commonly observed. Conclusion: Salivary gland lesions show a wide clinicopathological spectrum with a predominance of benign tumors, particularly pleomorphic adenoma. The parotid gland is the most commonly affected site, and lesions are most frequent in middle-aged individuals. Histopathological examination remains essential for accurate diagnosis and appropriate management. Keywords: Salivary gland lesions, Pleomorphic adenoma, Mucoepidermoid carcinoma.
Page No: 595-601 | Full Text
Original Research Article
ANTHROPOMETRY AND HAEMATOLOGICAL PROFILE OF BABIES BORN TO MOTHERS WITH PRE-ECLAMPSIA VERSUS NORMAL MOTHERS
http://dx.doi.org/10.70034/ijmedph.2026.2.101
Aswathy Sunil, Adheena Mini Augustine, Bini Mariam Chandy
View Abstract
Background: Neonatal survival depends on a wide range of factors, of which birth weight, head circumference, and hematological parameters at birth are key determinants. To tackle the high rates of neonatal deaths related to low birth weight, prematurity, and similar conditions, it is important to understand the factors leading to them, of which maternal hypertension is one of the common causes. The objective is to compare the hematological profile and anthropometric values of neonates born to mothers with preeclampsia and neonates born to healthy mothers. Materials and Methods: This was a cross-sectional study comprising 76 newborns who met the inclusion criteria. Neonates were grouped equally into those born to preeclamptic mothers and those born to healthy mothers, 38 in each group. Immediately after birth, 2 mL of blood was collected from the umbilical cord into a vacutainer anticoagulated with EDTA, and the following parameters were studied: hemoglobin, total count, differential count, platelet count, and the red cell indices MCV, MCH, and MCHC. The proforma was filled in, and all values were collected and entered systematically. All categorical data were analysed with the Chi-square test. Results: On comparing the newborns born to mothers with preeclampsia versus normal mothers, the former group had higher hemoglobin levels and hematocrit values and lower leukocyte and platelet counts than the latter group. MCV, MCH, and MCHC were similar in both groups. Conclusion: Leukopenia and thrombocytopenia were found at a higher rate in newborns of hypertensive mothers. Newborns of hypertensive mothers carry a risk of complications, primarily infection and bleeding. Keywords: Preeclampsia, Neonatal Hematological Profile, Anthropometric Measurements in Newborns, Birth Weight and Head Circumference, Maternal Hypertension, Neonatal Leukopenia.
Page No: 602-605 | Full Text
Original Research Article
A STUDY OF TISSUE DOPPLER IMAGING FOR ASSESSMENT OF LEFT VENTRICULAR DYSFUNCTION IN CORONARY ARTERY DISEASE
http://dx.doi.org/10.70034/ijmedph.2026.2.102
Bilal Khan, Prashant Udgire
View Abstract
Background: Coronary artery disease (CAD) remains the leading cause of morbidity and mortality worldwide. Early detection of left ventricular (LV) systolic and diastolic dysfunction is crucial for prognostic stratification and therapeutic decision-making. Conventional Doppler echocardiography assesses LV function using parameters such as the transmitral E/A ratio and left ventricular ejection fraction (LVEF). However, Tissue Doppler Imaging (TDI) provides direct quantitative assessment of myocardial velocities and may detect subclinical dysfunction earlier. The objective is to determine left ventricular mitral annular systolic (Sa) and early diastolic (Ea) velocities using TDI and compare them with conventional Doppler echocardiographic parameters in patients with CAD. Materials and Methods: This hospital-based observational study included 60 patients (39 males, 21 females) with newly or previously diagnosed CAD admitted to Khaja Banda Nawaz Teaching and General Hospital, Kalaburagi. All subjects underwent standard two-dimensional echocardiography, conventional pulsed-wave Doppler, and pulsed-wave TDI using a GE LOGIQ F8 system. Mitral annular velocities were recorded from the lateral annulus in the apical four-chamber view. Peak systolic (Sa), early diastolic (Ea), and late diastolic (Aa) velocities were measured and compared with conventional parameters including LVEF and the transmitral E/A ratio. Statistical significance was defined as p < 0.05. Results: TDI-derived Ea detected diastolic dysfunction in 75% of CAD patients compared to 60% detected by the conventional E/A ratio (p < 0.001). For systolic dysfunction, conventional LVEF identified 83.3% of cases, whereas Sa velocity detected 61.7%. Ea showed superior sensitivity in identifying diastolic dysfunction, while LVEF remained a stronger predictor of systolic dysfunction. 
Conclusion: TDI-derived Ea is a more sensitive parameter for detecting LV diastolic dysfunction in CAD patients compared to conventional Doppler E/A ratio. However, LVEF remains superior to Sa velocity in identifying systolic dysfunction. TDI parameters (Ea and Sa) provide valuable complementary diagnostic and prognostic information in CAD patients. Keywords: Coronary artery disease; Tissue Doppler Imaging; Left ventricular dysfunction; Mitral annular velocity; Ejection fraction.
Page No: 606-611 | Full Text
Original Research Article
CLINICAL PROFILE, EPIDEMIOLOGY, AND OUTCOME OF PATIENTS WITH CHLORPYRIFOS COMPOUND POISONING ADMITTED TO KIMS TEACHING HOSPITAL, KOPPAL: A PROSPECTIVE OBSERVATIONAL STUDY
http://dx.doi.org/10.70034/ijmedph.2026.2.103
Bhagath Kumar V, Umesh G Rajoor, Tanveer Ahmed T M, Gavisiddesh V Ronad
View Abstract
Background: Chlorpyrifos is a chlorinated organophosphate compound commonly implicated in pesticide poisoning in India. This study aimed to evaluate the clinical profile, epidemiology, and outcomes of patients with chlorpyrifos poisoning admitted to KIMS Teaching Hospital, Koppal. Materials and Methods: This prospective observational study was conducted at Koppal Institute of Medical Sciences from August 2024 to July 2025. Fifty-four patients with confirmed chlorpyrifos poisoning were enrolled. Demographic data, clinical features, laboratory parameters, treatment details, and outcomes were analyzed using descriptive statistics, the chi-square test, and the independent t-test. Results: The mean age was 32.10±9.95 years with male predominance (79.6%). The majority were suicidal cases (92.6%) among farmers (63.0%). Common clinical features included altered sensorium (48.1%), sweating (40.7%), and excess salivation (27.8%). The mortality rate was 33.3% among treated patients. Significant predictors of mortality included low GCS (p<0.001), high POP score (p<0.001), low SpO2 (p<0.001), mechanical ventilation requirement (p=0.002), respiratory distress (p=0.004), altered sensorium (p=0.002), and intermediate syndrome (p=0.008). Conclusion: Chlorpyrifos poisoning carries significant mortality. Early identification of high-risk patients using GCS and POP scoring with prompt intensive care may improve outcomes. Keywords: Chlorpyrifos; Organophosphate poisoning; Intermediate syndrome; Pseudocholinesterase; Mortality.
Page No: 612-616 | Full Text
Original Research Article
PROFILE OF SUPPURATIVE CORNEAL ULCER
http://dx.doi.org/10.70034/ijmedph.2026.2.104
Anil Kumar Verma, Anushree Gupta
View Abstract
Background: To analyze the prevalence of bacterial and fungal corneal ulcers among diagnosed cases of suppurative corneal ulcer and to compare the role of direct microscopy and culture in the etiological diagnosis of suppurative corneal ulcer. Materials and Methods: A total of 31 patients with clinically diagnosed suppurative corneal ulcer, of all ages and either sex, who presented during a period of one year were studied. A detailed history and ocular examination were performed for each patient. All corneal ulcers were scraped for direct microscopy and culture. Results: In this study, 19 cases (61.3%) were culture positive. Pure bacterial growth was observed in 15 cases (48.4%) and pure fungal growth in 3 cases (9.7%); in one case both bacterial and fungal growth was observed. The most common bacterial isolate was Staphylococcus aureus, and the most common fungal isolate was Acremonium species. Using culture as the gold standard, the sensitivity and specificity of the KOH wet mount were 25% and 92.6%, respectively, and those of the Gram-stained smear were 33.3% and 95.5%, respectively. Conclusion: We conclude that suppurative corneal ulcer is a major cause of preventable monocular blindness; educational strategies can reduce avoidable risks such as trauma, but treatment protocols are required to manage established disease. Keywords: Corneal ulcer, hypopyon, bacterial, fungal, stromal, conjunctival, microscopy, culture.
Page No: 617-621 | Full Text
Case Report
HYPONATREMIC HYPERTENSIVE SYNDROME WITH RENAL ISCHEMIA SECONDARY TO TAKAYASU VASCULITIS: A RARE PEDIATRIC CASE
http://dx.doi.org/10.70034/ijmedph.2026.2.105
Himanshi Aggarwal, Prasun Bhattacharje, Aditya Bhattacharjee
View Abstract
Takayasu arteritis is a chronic inflammatory large-vessel vasculitis predominantly affecting the aorta and its major branches, commonly seen in young females and frequently involving the renal arteries in pediatric patients from India and the Far East. Hyponatremic hypertensive syndrome is a rare but important clinical entity characterized by severe hypertension, hyponatremia, and polyuria, typically driven by unilateral renal ischemia with excessive renin secretion and pressure natriuresis from the contralateral kidney; early recognition is essential to prevent complications. This report describes a 4-year-old girl admitted to the pediatric intensive care unit with multiple afebrile convulsions and a history of polyuria and polydipsia, found to have absent bilateral brachial and radial pulses, higher lower-limb than upper-limb blood pressure, and grade 1 hypertensive retinopathy. Laboratory evaluation demonstrated persistent hyponatremia with polyuria and normal serum creatinine. Brain MRI showed multiple hypodensities, while CT angiography revealed multiple stenotic lesions of the right and left subclavian arteries and left vertebral artery, along with significant right renal artery stenosis and a small right kidney; echocardiography demonstrated left ventricular hypertrophy consistent with chronic hypertension. These findings supported a diagnosis of Takayasu arteritis complicated by renal artery stenosis and hyponatremic hypertensive syndrome, likely mediated by RAAS activation, contralateral pressure diuresis, and ADH-related volume effects exacerbating hyponatremia. Treatment with corticosteroids and methotrexate led to clinical improvement with normalization of blood pressure and serum sodium; however, repeat CT angiography showed progression of renal artery stenosis, prompting planning for renal angioplasty. 
The case underscores that refractory stage 2 hypertension with persistent hyponatremia in children should prompt evaluation for renal artery stenosis and vasculitis, with careful pulse examination and early imaging-based diagnosis to enable timely immunosuppression and, when needed, vascular intervention to reduce life-threatening complications and long-term morbidity. Keywords: Hyponatremic hypertensive syndrome (HHS), Takayasu arteritis (TA), renal artery stenosis, pediatric hypertension, EULAR.
Page No: 622-624 | Full Text
Original Research Article
CLINICAL PROFILE, ETIOLOGY, COMORBIDITIES, AND TREATMENT PATTERNS AMONG PATIENTS WITH CHRONIC KIDNEY DISEASE ATTENDING A TERTIARY CARE HOSPITAL IN ODISHA: A CROSS-SECTIONAL STUDY
http://dx.doi.org/10.70034/ijmedph.2026.2.106
Ramakanta Panda, Lopamudra Das, Shakti Prakash Mishra
View Abstract
Background: Chronic kidney disease (CKD) represents an increasing public health and clinical challenge in India, driven largely by the rising burden of non-communicable diseases such as diabetes and hypertension. Delayed diagnosis and limited access to renal replacement therapy contribute to significant morbidity, mortality, and financial hardship. The objective is to assess the clinical profile, etiology, comorbidities, treatment status, and direct treatment-related costs among CKD patients attending a tertiary care hospital in Odisha. Materials and Methods: A hospital-based cross-sectional study was conducted among 180 adult CKD patients attending the nephrology outpatient department and dialysis unit of PGIMER & Capital Hospital, Bhubaneswar, Odisha. Data on sociodemographic characteristics, CKD etiology, comorbidities, and treatment modalities were collected using a semi-structured schedule and verified through medical records. Descriptive statistics and Chi-square tests were used for analysis. Institutional ethical approval and informed consent were obtained. Results: The mean age of patients was 54.1 ± 13.8 years, with males comprising 72.2%. Diabetic nephropathy (40.0%) and hypertensive nephropathy (27.2%) were the leading etiologies. Hypertension (61.7%) and type 2 diabetes mellitus (48.9%) were the most common comorbidities. Advanced CKD (Stages 4–5) constituted 61.1% of the study population. Hemodialysis was the predominant treatment modality (35.0%), while peritoneal dialysis (3.3%) and renal transplantation (3.3%) were less frequently utilized. A majority of dialysis services were accessed from private hospitals. Median monthly expenditure for dialysis was ₹10,500, in addition to transportation and medication costs. Utilization of financial protection schemes was partial. The study highlights a substantial burden of CKD among individuals in their productive years, with diabetes and hypertension as key etiological and comorbid factors. 
Limited public sector dialysis capacity and reliance on private services contribute to financial stress among patients and families. Conclusion: Strengthening early CKD detection in primary care, improving NCD management, expanding renal replacement capacity, and enhancing financial protection mechanisms are essential to reduce the clinical and economic burden of CKD in resource-constrained settings. Keywords: Chronic kidney disease, Diabetes mellitus, Hypertension, Comorbidity, Hemodialysis, Renal replacement therapy, Renal transplantation, Financial burden.
Page No: 625-630 | Full Text
Original Research Article
TO COMPARE THE EFFECTIVENESS OF DAILY VERSUS ALTERNATE DAY ORAL IRON THERAPY FOR OBSTETRIC PATIENTS WITH IRON DEFICIENCY ANAEMIA: A RANDOMIZED CONTROLLED TRIAL
http://dx.doi.org/10.70034/ijmedph.2026.2.107
Kanika Bansal, Chandana C, Sushma R
View Abstract
Background: Iron deficiency anaemia remains a major public health concern among obstetric patients, with significant maternal and fetal consequences. Conventional daily oral iron therapy is often limited by poor compliance due to gastrointestinal side effects. Alternate-day dosing has been proposed to improve iron absorption and tolerability. This study aimed to compare the effectiveness of daily versus alternate-day oral iron therapy in obstetric patients with iron deficiency anaemia. Materials and Methods: This randomized controlled trial was conducted on 46 obstetric patients with haemoglobin <11 g/dL, allocated into two groups: daily (n=23) and alternate-day (n=23) oral iron therapy (100 mg elemental iron). Patients were followed for 8 weeks. Haemoglobin, haematological indices, serum ferritin, gastrointestinal side effects, and compliance were assessed. Statistical analysis was performed using appropriate tests, with p < 0.05 considered significant. Results: Both groups showed significant improvement in haemoglobin levels from baseline to 8 weeks (p < 0.001). At 8 weeks, haemoglobin was higher in the alternate-day group (11.16 ± 1.15 g/dL) compared to the daily group (10.73 ± 1.07 g/dL), though not statistically significant (p > 0.05). Serum ferritin improved in both groups without significant intergroup difference. Gastrointestinal side effects, particularly epigastric pain and constipation, were significantly higher in the daily group (p < 0.05). Compliance was significantly better in the alternate-day group (85.15% vs 72.49%, p = 0.0006). Conclusion: Alternate-day oral iron therapy is as effective as daily therapy in improving haemoglobin and iron stores, with better tolerability and compliance, making it a suitable alternative in obstetric patients. Keywords: Iron deficiency anaemia, pregnancy, oral iron therapy, alternate-day dosing, haemoglobin, compliance, randomized controlled trial
Page No: 631-636 | Full Text
Original Research Article
MICROBIAL FLORA OF HEALTH CARE WORKERS' ACCESSORIES - A POTENTIAL SOURCE OF HEALTHCARE-ASSOCIATED INFECTIONS
http://dx.doi.org/10.70034/ijmedph.2026.2.108
K. Snehaa, Sukriti Singh, Vineeta, Shwetha V R, Ruby Thomas
View Abstract
Background: Healthcare-associated infections (HCAIs) pose a significant risk to hospitalized patients, with healthcare workers (HCWs) and their accessories often acting as potential sources of transmission. This study aimed to assess the microbial contamination of common HCWs' accessories, such as mobile phones, stethoscopes, pens, and finger rings, to better understand their role in spreading infections and antimicrobial resistance. Materials and Methods: This cross-sectional study was conducted from September to October 2023 at Andaman and Nicobar Islands Institute of Medical Sciences, Sri Vijayapuram. A total of 63 HCWs (52 doctors and 11 nurses) participated. Samples were collected from mobile phones, stethoscopes, finger rings, and pens using sterile cotton-tipped swabs. The samples were processed for microbial growth as per standard bacteriological protocols, followed by antimicrobial susceptibility testing using the Kirby-Bauer disc diffusion method. Results: Among the 193 samples collected, 76.1% were contaminated with pathogens, with mobile phones (84.1%) showing the highest contamination rate, followed by finger rings (79.2%), stethoscopes (74.4%), and pens (68.3%). The predominant microorganisms isolated were Coagulase Negative Staphylococcus (CONS) (55.1%) and Staphylococcus aureus (42.1%). A significant proportion of these pathogens were resistant to common antibiotics, with 48.38% of S. aureus being methicillin-resistant (MRSA) and 44.44% of CONS being methicillin-resistant (CONS-MR). Conclusion: This study highlights the high prevalence of bacterial contamination of HCWs' accessories, particularly mobile phones. It underscores the need for routine disinfection of these items and improved hand hygiene practices to prevent the transmission of harmful pathogens in hospital settings. 
The findings also suggest the potential role of these accessories in the spread of antimicrobial-resistant organisms, warranting stricter infection control protocols in healthcare settings. Keywords: Healthcare-associated infections, healthcare workers, microbial contamination, mobile phones, stethoscopes, finger rings, pens, antimicrobial resistance
Page No: 637-643 | Full Text
Original Research Article
EXPLORING THE LINK BETWEEN GLYCEMIC STATUS AND PARATHYROID HORMONE IN TYPE 2 DIABETES
http://dx.doi.org/10.70034/ijmedph.2026.2.109
Zeba Tanveer, Katta Vani, B Aradhana, Geetha Meenakshi
View Abstract
Background: Parathyroid hormone (PTH) is composed of 84 amino acids and is secreted by parathyroid cells. It plays a vital role in mineral and bone metabolism by promoting bone resorption, inhibiting urinary calcium (Ca) loss, and accelerating vitamin D activation.[1] High blood glucose non-enzymatically glycates haemoglobin at numerous sites. The process of glycation takes 120 days, so this property of Hb is used to track the average amount of glucose in the blood and reflects a patient's glycaemic state over the past 3 months.[2] Objectives: 1. To measure HbA1c levels in both cases and controls. 2. To measure PTH levels in cases and controls. 3. To correlate the association of HbA1c and PTH among cases and controls. Materials and Methods: A case–control study of 100 participants, comprising 50 cases of type 2 diabetes mellitus and 50 normal healthy individuals. HbA1c levels were measured using HPLC (BIORAD D10) and PTH levels were measured using CLIA (SIEMENS ADVIA CENTAUR XP). Spearman's rho was used to assess the correlation between HbA1c and PTH in cases, and Pearson's correlation analysis was used to assess the correlation between HbA1c and PTH in controls. Results: One hundred participants (50 type 2 diabetes patients and 50 age- and sex-matched controls) were studied. HbA1c and PTH differed significantly between groups, being higher in cases. Correlations between HbA1c and PTH were weak in both groups, with large (HbA1c) and moderate (PTH) effect sizes. PTH levels were higher in higher quartiles. Conclusion: Patients with type 2 diabetes showed significantly higher HbA1c and PTH levels than controls. However, only a weak correlation existed between these variables, suggesting multifactorial regulation of PTH beyond glycemia. Elevated PTH may contribute to skeletal complications in diabetes, warranting further large-scale longitudinal studies. Keywords: HbA1c, PTH, Duration of diabetes mellitus.
Page No: 644-648 | Full Text
Original Research Article
EVALUATING THE IMPACT OF AN EDUCATIONAL INTERVENTION ON KNOWLEDGE OF THE ‘PROTECTION OF CHILDREN FROM SEXUAL OFFENCES’ (POCSO) ACT AMONG HIGH SCHOOL STUDENTS
http://dx.doi.org/10.70034/ijmedph.2026.2.110
Keerthiga S, N. Banerji, S. Sunitha
View Abstract
Background: Child sexual abuse (CSA) remains a major public health concern globally and in India. According to UNICEF (2024), 1 in 5 girls and 1 in 7 boys experience sexual violence during childhood. The National Crime Records Bureau (2022) reported that only 39.7% of crimes against children were registered under the POCSO Act. Limited awareness of laws, social inequalities, and cultural norms contribute significantly to vulnerability. School-based educational interventions can empower children with knowledge of safety, reporting mechanisms, and legal protection. Materials and Methods: This study was conducted among 92 students of classes 8th–10th in a government school in the UHTC field practice area of Tirupati. Students willing to participate, with assent and parental consent, were included. A questionnaire validated by legal experts was used to assess pre- and post-intervention knowledge. The educational intervention comprised two awareness sessions on the POCSO Act. Statistical analysis included paired t-tests and chi-square tests. Results: Participants were predominantly male (70.7%). Pre-test results showed that only 11 students (12%) had adequate knowledge; post-intervention, this increased significantly to 56 students (60.9%). The mean knowledge score improved significantly from 7.15 to 10.79 (p < 0.001), reflecting the impact of the educational intervention. Conclusion: The educational intervention significantly improved awareness and understanding of the POCSO Act among high school students. Incorporating legal-awareness programs into school health initiatives is crucial to empower adolescents, enhance reporting, and prevent child sexual abuse. Keywords: POCSO, educational intervention, knowledge assessment, child sexual abuse, awareness, high school students, Tirupati.
Page No: 649-654 | Full Text
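The pre/post comparison in the abstract above rests on a paired t-test. As a minimal sketch of how that statistic is computed (the eight scores below are hypothetical illustration data, not the study's 92-student dataset):

```python
import math
from statistics import mean, stdev

def paired_t(pre, post):
    """Paired t-statistic: mean of the per-subject differences divided by
    the standard error of those differences. Returns (t, degrees of freedom)."""
    assert len(pre) == len(post)
    d = [b - a for a, b in zip(pre, post)]
    n = len(d)
    return mean(d) / (stdev(d) / math.sqrt(n)), n - 1

# Hypothetical knowledge scores for 8 students (illustration only)
pre = [6, 7, 8, 5, 9, 7, 6, 8]
post = [10, 11, 12, 9, 12, 10, 11, 11]
t, df = paired_t(pre, post)
print(round(t, 2), df)  # → 15.0 7
```

The resulting t with its degrees of freedom is then compared against the t-distribution to obtain the p-value (in practice via a statistics package rather than by hand).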
Case Report
MULTIDISCIPLINARY APPROACH TO SALVAGING A PRECIOUS DIABETIC LIMB WITH CONTRALATERAL LIMB CONSIDERATIONS: A CASE REPORT
http://dx.doi.org/10.70034/ijmedph.2026.2.111
Premkumar E, Aravind P
Background: Diabetic foot ulcers are a leading cause of non-traumatic lower limb amputations, particularly in patients with poor glycemic control and peripheral vascular disease. Limb preservation becomes critically important when the contralateral limb is already compromised, as further amputation can result in significant functional disability. Case Presentation: We present the case of a 56-year-old male with long-standing uncontrolled diabetes mellitus who developed an infected ulcer over the right foot. The patient had a prior history of contralateral forefoot amputation, rendering the affected limb functionally critical. Clinical evaluation revealed a deep ulcer with surrounding cellulitis, peripheral neuropathy, and reduced distal perfusion. Laboratory investigations showed poor glycemic control and elevated inflammatory markers. Imaging confirmed soft tissue infection without advanced osteomyelitis. A multidisciplinary approach involving diabetology, surgery, vascular assessment, and wound care was implemented. The patient underwent prompt surgical debridement, culture-directed antibiotic therapy, strict glycemic control, and structured wound care with appropriate offloading strategies. Conclusion: This case highlights the importance of early multidisciplinary intervention in salvaging a high-risk diabetic limb, especially in the presence of contralateral limb compromise. Coordinated care can effectively prevent major amputation and preserve functional independence. Keywords: Diabetic foot ulcer; Limb salvage; Multidisciplinary care; Peripheral arterial disease; Wound management.
Page No: 655-659 | Full Text
Original Research Article
PATTERNS OF VACCINATION COVERAGE AT THE MODEL IMMUNIZATION CENTER: INSIGHTS FROM SIMS/SERVICES HOSPITAL LAHORE
http://dx.doi.org/10.70034/ijmedph.2026.2.112
Zohra Khanum, Fatima Tahira
Background: Vaccination is a cornerstone of public health, effectively reducing morbidity and mortality from infectious diseases. In Pakistan, the Expanded Program on Immunization (EPI) targets multiple vaccine-preventable diseases, yet significant disparities persist due to socioeconomic, geographic, and cultural barriers. Urban centers like the Model Immunization Center at SIMS/Services Hospital Lahore provide an opportunity to assess immunization patterns and inform strategies for improving vaccine uptake. Methodology: A retrospective, observational study was conducted at the Model Immunization Center over a four-month period (September 26, 2024, to January 18, 2025). Vaccination records were reviewed and categorized by type (routine, traveler’s, emergency, and COVID-19), shift (morning, evening, night), and gender. Descriptive statistics were used to analyze trends in coverage. Results: Out of 12,200 total vaccinations, traveler’s vaccines accounted for 9,500 (77.8%), followed by routine (1,200; 9.8%), emergency (800; 6.6%), and COVID-19 vaccines (700; 5.7%). The majority were administered during the morning shift (60–80%), indicating a time-of-day preference. Males received a higher proportion of all vaccine types (55–70%), particularly in the traveler’s and COVID-19 categories. Routine and emergency vaccinations showed moderate uptake, while COVID-19 vaccination coverage remained relatively low despite ongoing public health efforts. Conclusion: The study reveals distinct vaccination patterns at an urban immunization center, highlighting the high demand for traveler’s vaccines and moderate uptake of routine and emergency immunizations. Gender disparities and limited uptake during evening and night shifts suggest the need for more inclusive and flexible vaccination strategies. Keywords: Vaccination, EPI, immunization.
Page No: 660-664 | Full Text
Original Research Article
A COMPARATIVE STUDY OF NASOGASTRIC VERSUS NASOJEJUNAL FEEDING IN CRITICALLY ILL PATIENTS ADMITTED TO A TRAUMA CARE UNIT
http://dx.doi.org/10.70034/ijmedph.2026.2.113
Sagar Gawali, Aditya Patil, Renuka Purohit
Background: Enteral nutrition is a cornerstone in the management of critically ill patients. While nasogastric (NG) feeding is commonly used, it is associated with complications such as feed intolerance and aspiration. Nasojejunal (NJ) feeding may offer advantages by bypassing the stomach and improving nutrient delivery. The aim is to compare the efficacy and safety of nasogastric and nasojejunal feeding in critically ill patients admitted to a trauma care unit. Materials and Methods: This prospective observational study was conducted in a trauma care unit over a period of two years. A total of 120 critically ill patients requiring enteral nutrition were included and divided equally into NG (n=60) and NJ (n=60) feeding groups. Parameters assessed included time to achieve target nutrition, gastric residual volume, serum albumin levels, achievement of feeding goals, complications, ICU stay, and mortality. Results: The NJ group achieved target nutrition significantly faster than the NG group (23.6 ± 6.8 vs 35.6 ± 3.4 hours, p<0.05). Gastric residual volume was significantly lower in the NJ group (517.8 ± 107.9 vs 917.4 ± 155.3 mL, p<0.05). Post-feeding serum albumin levels were significantly higher in the NJ group (3.76 ± 1.05 vs 3.12 ± 0.78 g/dL, p<0.05), and a higher proportion achieved feeding goals (88.3% vs 65.0%, p<0.05). No significant differences were observed in ICU stay, mortality, or complication rates between the groups. Conclusion: Nasojejunal feeding improves nutritional delivery and feeding efficiency compared to nasogastric feeding but does not significantly affect clinical outcomes such as mortality or ICU stay. It may be preferred in patients at high risk of feed intolerance. Keywords: Enteral nutrition; Nasogastric feeding; Nasojejunal feeding; Critical care; Trauma ICU.
Page No: 665-669 | Full Text
Original Research Article
COMPARATIVE STUDY OF MACHINE LEARNING–BASED PREDICTION MODELS VERSUS CONVENTIONAL CLINICAL RISK ASSESSMENT FOR POSTPARTUM HEMORRHAGE
http://dx.doi.org/10.70034/ijmedph.2026.2.114
Arushi Sharma, Arihant Sharma, Advaita Sharma
Background: Postpartum hemorrhage (PPH) remains one of the leading causes of maternal morbidity and mortality worldwide and continues to pose a major challenge in obstetric practice, particularly in tertiary care settings where high-risk pregnancies are frequently managed. Conventional clinical risk assessment is routinely used to identify women at risk; however, its predictive ability may be limited because of the multifactorial nature of PPH. With the growing use of digital health records and advanced analytics, machine learning–based prediction models have emerged as a promising approach for improving early identification of high-risk cases. Aim: To compare the predictive performance of machine learning–based prediction models with conventional clinical risk assessment for postpartum hemorrhage among women delivering at a tertiary care hospital. Materials and Methods: This hospital-based comparative observational study was conducted in the Department of Obstetrics and Gynecology at a tertiary care hospital. A total of 70 pregnant women admitted for vaginal delivery or cesarean section were included. Data were collected using a structured proforma from patient history, clinical examination, case records, labor room notes, operative records, and laboratory investigations. Demographic, obstetric, clinical, and delivery-related variables were recorded, including age, parity, body mass index, anemia, hypertensive disorders, previous cesarean section, placental abnormalities, induction of labor, prolonged labor, mode of delivery, macrosomia, and uterine atony. All patients were assessed by both conventional clinical risk assessment and machine learning–based prediction methods. Data were entered in Microsoft Excel and analyzed using SPSS version 27.0. 
Descriptive and comparative statistical analyses were performed, and model performance was compared using sensitivity, specificity, positive predictive value, negative predictive value, accuracy, and area under the receiver operating characteristic curve (AUROC). Results: Out of 70 patients, 14 (20.00%) developed postpartum hemorrhage. Significant factors associated with PPH included obesity (50.00% vs 19.64%, p=0.024), previous cesarean section (50.00% vs 21.43%, p=0.036), anemia (57.14% vs 21.43%, p=0.011), hypertensive disorders of pregnancy (35.71% vs 14.29%, p=0.049), placenta previa/abruption (28.57% vs 8.93%, p=0.041), prolonged labor (42.86% vs 16.07%, p=0.029), and uterine atony (64.29% vs 10.71%, p<0.001). The machine learning model outperformed conventional clinical risk assessment with higher sensitivity (85.71% vs 64.29%), specificity (89.29% vs 73.21%), positive predictive value (66.67% vs 37.50%), accuracy (88.57% vs 71.43%), and AUROC (0.91 vs 0.74). Conclusion: Machine learning–based prediction models demonstrated superior performance over conventional clinical risk assessment in predicting postpartum hemorrhage. Their use may enhance early risk stratification, clinical preparedness, and timely intervention in tertiary obstetric care. Keywords: Postpartum hemorrhage; machine learning; clinical risk assessment; prediction model; tertiary care hospital.
Page No: 670-677 | Full Text
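The model comparison above reports sensitivity, specificity, PPV, and accuracy. These are all ratios from a 2×2 confusion matrix, and a quick sketch can verify the reported ML-model figures. The cell counts below are back-calculated from the abstract's own numbers (14 PPH cases, 56 without; 85.71% sensitivity implies TP = 12, 89.29% specificity implies TN = 50), so this is a consistency check, not the study's actual model code:

```python
def diagnostic_metrics(tp, fn, tn, fp):
    """Standard confusion-matrix metrics, returned as percentages."""
    return {
        "sensitivity": 100 * tp / (tp + fn),   # true positive rate
        "specificity": 100 * tn / (tn + fp),   # true negative rate
        "ppv": 100 * tp / (tp + fp),           # positive predictive value
        "npv": 100 * tn / (tn + fn),           # negative predictive value
        "accuracy": 100 * (tp + tn) / (tp + fn + tn + fp),
    }

# Counts inferred from the reported ML-model results (assumption, see lead-in)
m = diagnostic_metrics(tp=12, fn=2, tn=50, fp=6)
print({k: round(v, 2) for k, v in m.items()})
```

Rounded to two decimals, these counts reproduce the reported 85.71% sensitivity, 89.29% specificity, 66.67% PPV, and 88.57% accuracy.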
Original Research Article
EVALUATION OF PREOPERATIVE DEXMEDETOMIDINE NEBULISATION ON THE PRESSOR RESPONSE TO LARYNGOSCOPY AND INTUBATION
http://dx.doi.org/10.70034/ijmedph.2026.2.115
Mohit Gedekar, Rahul Saxena, Geeta Bhandari, Jahanara, Deep Chilana
Background: Direct laryngoscopy and endotracheal intubation trigger a sympathetic surge, leading to increases in heart rate and blood pressure, which can be risky in patients with cardiovascular or neurological comorbidities. Dexmedetomidine, a selective α2-adrenergic agonist, has been shown to provide sedation, analgesia, and hemodynamic stability. Nebulized administration offers a non-invasive, patient-friendly alternative. Materials and Methods: This prospective, double-blind, randomized study included 88 adult patients (ASA I–II) undergoing elective surgery under general anesthesia. Patients were randomly allocated to receive nebulized dexmedetomidine (1 µg/kg, n=44) or saline (5 mL, n=44) 30 minutes before induction. Heart rate, blood pressure, SpO₂, propofol requirement, sedation scores (Ramsay), and side effects were recorded from pre-nebulization to 10 minutes post-intubation. Results: Baseline demographics were comparable. Dexmedetomidine significantly attenuated the rise in heart rate, systolic, diastolic, and mean arterial pressures during induction and intubation (p < 0.05). Propofol requirement was lower (84.45 ± 8.51 mg vs 101.75 ± 16.7 mg, p < 0.001), and Ramsay Sedation Scores were higher (3.18 ± 0.62 vs 1.93 ± 0.54, p < 0.001) in the dexmedetomidine group. Minor side effects, including bradycardia (6.9%) and hypotension (4.6%), were observed, with no significant differences compared to saline. Oxygen saturation remained stable. Conclusion: Preoperative nebulized dexmedetomidine is a safe, effective, and non-invasive method to attenuate the pressor response to laryngoscopy and intubation, reduce anesthetic requirement, and enhance sedation without compromising safety. Keywords: Dexmedetomidine, Nebulization, Laryngoscopy, Hemodynamic, General Anesthesia.
Page No: 678-682 | Full Text
Original Research Article
COMPARATIVE ANALYSIS OF SERUM ELECTROLYTE PROFILE AND SERUM URIC ACID LEVELS IN DIAGNOSED HYPERTENSIVE AND NORMOTENSIVE PATIENTS
http://dx.doi.org/10.70034/ijmedph.2026.2.116
Devaki R.N, Prajna K, Vinay Kumar K, Naresh, Mahesh V
Background: Hypertension is a major global health concern and a leading risk factor for cardiovascular morbidity and mortality. Alterations in serum electrolyte levels and uric acid have been implicated in the pathophysiology of hypertension, influencing vascular tone, fluid balance, and renal function. Understanding these biochemical changes may aid in improving disease management and prevention strategies. The objective is to compare serum electrolyte profile (sodium, potassium, and chloride) and serum uric acid levels between hypertensive and normotensive individuals, and to assess the prevalence of associated biochemical abnormalities. Materials and Methods: This case-control study included 126 participants, comprising 63 hypertensive and 63 normotensive individuals. Baseline demographic and clinical data were recorded. Serum sodium, potassium, chloride, and uric acid levels were measured using standard laboratory methods. Statistical analysis was performed using the unpaired Student’s t-test, with a p-value < 0.05 considered statistically significant. Results: Hypertensive subjects demonstrated significantly higher mean serum sodium (140.75 ± 3.70 mmol/L), chloride (104.05 ± 2.70 mmol/L), and uric acid levels (4.47 ± 1.40 mg/dL) compared to normotensive individuals (135.08 ± 6.11 mmol/L, 99.62 ± 5.09 mmol/L, and 3.71 ± 1.14 mg/dL, respectively; p < 0.05). Serum potassium levels were significantly lower in hypertensive subjects (3.83 ± 0.56 mmol/L) than in normotensive controls (4.06 ± 0.48 mmol/L; p = 0.016). Additionally, the prevalence of hypernatremia, hypokalaemia, hyperchloremia, and hyperuricemia was higher among hypertensive individuals. Conclusion: Hypertension is associated with significant alterations in serum electrolyte levels and elevated uric acid. 
These findings highlight the importance of routine biochemical monitoring and suggest that correcting electrolyte imbalance and hyperuricemia may improve blood pressure control and reduce complications. Keywords: Hypertension; Electrolytes; Sodium; Potassium; Chloride; Uric acid; Hyperuricemia.
Page No: 683-686 | Full Text
Original Research Article
A STUDY OF VARIATIONS IN SUPRAMEATAL SPINE AND OTHER LANDMARKS ON THE LATERAL SURFACE OF TEMPORAL BONE AMONG CSOM PATIENTS
http://dx.doi.org/10.70034/ijmedph.2026.2.117
B.V.N.Muralidhar Reddy, George Pallapati, B. Zaiba kousar
Background: Chronic Suppurative Otitis Media (CSOM) is a common middle ear disease requiring surgical management. Identification of reliable anatomical landmarks, particularly the suprameatal spine (Henle’s spine), is essential for safe mastoidectomy. The objective is to evaluate the variations in the suprameatal spine and its relationship with lateral temporal bone landmarks in patients with CSOM. Materials and Methods: A hospital-based cross-sectional study was conducted on 120 patients with CSOM undergoing mastoidectomy at a tertiary care center. Intraoperative assessment included morphology of the suprameatal spine, mastoid pneumatization, ossicular status, and morphometric measurements of key anatomical landmarks. Data were analyzed using SPSS version 19. Results: The crest type of suprameatal spine was most common (68.3%), followed by triangular type (15%) and absence (16.7%). Absence of the spine was significantly associated with non-pneumatized mastoids (p < 0.05). The mean distance from Henle’s spine to the lateral semicircular canal and sinodural angle was 15.2 mm and 16.8 mm, respectively. Ossicular erosion was observed in 30% of cases, with the incus most commonly affected. Most patients had mild to moderate conductive hearing loss. Conclusion: Although the suprameatal spine is a valuable surgical landmark, its variability—especially in sclerotic mastoids—necessitates the use of multiple anatomical references during mastoidectomy. Understanding these variations can improve surgical safety and outcomes. Keywords: Chronic Suppurative Otitis Media; Suprameatal Spine; Henle’s Spine; Mastoidectomy; Hearing Loss.
Page No: 687-691 | Full Text
Original Research Article
ASSOCIATION BETWEEN SCREEN TIME AND BEHAVIOURAL HEALTH PROBLEMS AMONG TEENAGERS ATTENDING A TERTIARY CARE HOSPITAL IN LUCKNOW: A CROSS-SECTIONAL STUDY
http://dx.doi.org/10.70034/ijmedph.2026.2.118
Mehak Singh, Saksham Srivastava, Kamran Ahmad, Dhananjay Kumar
Background: The increasing penetration of digital media has significantly altered the lifestyle of teenagers, raising concerns regarding its potential impact on behavioural health. This study was conducted to estimate screen time and assess its association with behavioural health problems among teenagers aged 11–16 years. Materials and Methods: A hospital-based cross-sectional study was carried out from January to December 2024 in the pediatric outpatient department of a tertiary care hospital in Lucknow. A total of 159 teenagers were enrolled using convenience sampling. Screen time was assessed using a structured questionnaire and behavioural health was evaluated using the parent-reported Strengths and Difficulties Questionnaire (SDQ). Statistical analysis was performed using SPSS version 29, applying chi-square and independent t-tests, with a p-value <0.05 considered statistically significant. Results: The mean age of participants was 13.3 ± 1.7 years, with males comprising 52.2%. Overall, 68.6% of teenagers reported screen time exceeding 2 hours per day. Higher screen time was significantly associated with increasing age and personal smartphone ownership (p<0.05). Teenagers with screen time greater than 2 hours per day had significantly higher SDQ total difficulty scores (16.5 ± 5.0 vs 11.2 ± 4.1; p<0.001). Emotional problems, conduct problems, hyperactivity/inattention, and peer problems were significantly more prevalent in the higher screen time group. Conclusion: Excessive screen time is significantly associated with behavioural health problems among teenagers, highlighting the need for parental regulation and early intervention strategies. Keywords: Screen time, teenagers, behavioural health, SDQ, smartphone use.
Page No: 692-695 | Full Text
Original Research Article
A CLINICAL STUDY OF ACUTE PANCREATITIS AND EVALUATION OF RANSON’S SCORE IN DIAGNOSIS, COMPLICATIONS, AND MANAGEMENT
http://dx.doi.org/10.70034/ijmedph.2026.2.119
Sourabh Prajapat, Dipanshi Shah, Shashank Jain, Sandesh Deolekar, Madhav Lakhotia, Akshata Thakur
Background: Acute pancreatitis is a common gastrointestinal emergency with a wide spectrum of severity, ranging from mild self-limiting disease to severe life-threatening illness. Early prediction of disease severity is crucial for optimizing management and improving outcomes. Ranson’s scoring system is one of the earliest and most widely used tools for prognostication; however, its applicability in different populations remains debatable. The aim was to study the clinical profile, etiology, and natural history of acute pancreatitis and to evaluate the utility of Ranson’s score in predicting disease severity, complications, and management outcomes. Materials and Methods: This prospective observational study was conducted in the Department of General Surgery at Dr. D. Y. Patil Hospital, Navi Mumbai, over a period of two years (March 2022–March 2024). A total of 50 patients diagnosed with acute pancreatitis were included. Clinical evaluation, laboratory investigations, and imaging studies were performed. Ranson’s score was calculated at admission and after 48 hours. Patients were managed according to standard protocols and followed for outcomes including complications, need for intervention, and mortality. Statistical analysis was performed using SPSS version 25.0, and associations were assessed using appropriate tests. Results: The majority of patients were above 50 years (64%), with a male predominance (58%). Alcohol (46%) and gallstones (44%) were the leading etiological factors. Abdominal pain (96%) was the most common presenting symptom. Based on Ranson’s score, 62% of patients were classified as having severe acute pancreatitis. Most patients (88%) were managed conservatively, and 82% did not develop complications. Among complications, pancreatic pseudocyst (8%) was the most common. No statistically significant association was observed between Ranson’s score and etiology (p = 0.66), management (p = 0.51), or complications (p = 0.14). 
Conclusion: Ranson’s score, although useful in assessing severity, demonstrated limited predictive value for complications and management outcomes in this study. Conservative management was effective in the majority of cases, including those classified as severe. These findings suggest that Ranson’s score should be used in conjunction with other clinical parameters and scoring systems for better prognostication in acute pancreatitis. Keywords: Acute pancreatitis, Ranson’s score, severity assessment, complications, conservative management.
Page No: 696-702 | Full Text
Original Research Article
UTILITY OF MID-UPPER ARM CIRCUMFERENCE AS A SCREENING TOOL FOR PREDICTING LOW BIRTH WEIGHT IN NEONATES
http://dx.doi.org/10.70034/ijmedph.2026.2.120
Nahid Akhtar, Saksham Srivastava, Singh Bhawana Komal, Kamran Ahmad
Background: Low birth weight (LBW) is associated with a high risk of infections, breathing difficulty, hypothermia, and feeding problems. LBW should therefore be detected early so that newborns can receive appropriate care soon after delivery. However, recording an accurate birth weight may not always be possible because of inadequate equipment in resource-poor settings, especially in rural areas. This study was conducted to determine the efficacy of mid-upper arm circumference (MUAC) as a screening tool to identify low birth weight babies where no weighing scales are available. The aim is to study the correlation between mid-upper arm circumference and birth weight in neonates in a tertiary care hospital in Lucknow. Materials and Methods: A cross-sectional observational study was conducted at a tertiary care center. A total of 100 newborns were included in this study. MUAC and birth weight were measured. Descriptive statistics, including mean, standard deviation, and range, were calculated for MUAC and birth weight. The correlation between MUAC and birth weight was assessed. Results: A total of 100 neonates were assessed within 24 hours of life. Correlation analysis demonstrated that birth weight showed a significant positive association with mid-upper arm circumference (p < 0.001), indicating that lower birth weight is closely linked with lower mid-upper arm circumference. Conclusion: Neonates with low MUAC values consistently fell into lower birth weight categories, whereas high MUAC values were strongly associated with higher birth weights. MUAC showed a very strong positive correlation with birth weight. Keywords: Mid upper arm circumference (MUAC), Low birth weight (LBW), Newborns, Screening tool.
Page No: 703-706 | Full Text
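The correlation the abstract above relies on is typically quantified with Pearson's r. A minimal sketch, using hypothetical MUAC/birth-weight pairs rather than the study's 100 neonates:

```python
import math

def pearson_r(xs, ys):
    """Pearson correlation coefficient: covariance of the two series
    divided by the product of their standard deviations."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical MUAC (cm) / birth weight (kg) pairs, illustration only
muac = [8.5, 9.0, 9.5, 10.0, 10.5, 11.0]
weight = [2.0, 2.3, 2.6, 2.8, 3.1, 3.4]
r = pearson_r(muac, weight)
print(round(r, 3))  # close to 1: strong positive correlation
```

An r near +1, as the study reports for MUAC versus birth weight, means one variable can serve as a linear proxy for the other, which is exactly what makes MUAC usable as a screening substitute for a weighing scale.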
Original Research Article
ASSESSMENT OF STABILITY AND HEALING IN MANDIBULAR CONDYLAR FRACTURES MANAGED WITH LOCKING PLATE SYSTEMS VERSUS CONVENTIONAL MINIPLATES WITH EVALUATION OF BONE HEALING PATTERNS AND PATHOPHYSIOLOGICAL CHANGES
http://dx.doi.org/10.70034/ijmedph.2026.2.121
Noor ul Wahab, Irfan Qureshi, Muhtada Ahmad, Riaz Gul, Aamir Sharif, Fizza Tariq
Background: Mandibular condylar fractures are among the most common fractures of the maxillofacial region and pose significant challenges in achieving optimal functional and anatomical outcomes. Objective: To assess and compare the stability and healing outcomes of mandibular condylar fractures managed with locking plate systems versus conventional miniplates, focusing on bone healing patterns and associated biological responses. Materials and Methods: This comparative observational study included patients with unilateral or bilateral mandibular condylar fractures treated surgically using either locking plate systems or conventional miniplates. Postoperative evaluation was conducted during a defined follow-up period to assess fracture stability, radiographic evidence of bone healing, occlusal alignment, and functional recovery. Healing patterns were analyzed using radiographic imaging, while clinical assessment included pain, mouth opening, and complications such as infection or malocclusion. Results: Patients treated with locking plate systems demonstrated improved initial stability and more uniform bone healing patterns compared to those treated with conventional miniplates. Radiographic evidence indicated faster callus organization and reduced micromovement at the fracture site in the locking plate group. Clinically, better functional outcomes, including improved mouth opening and reduced postoperative discomfort, were observed. Complication rates were comparatively lower in the locking plate group. Conclusion: Locking plate systems provide superior biomechanical stability and promote more favorable bone healing patterns in mandibular condylar fractures compared to conventional miniplates. Their use is associated with improved functional outcomes and reduced postoperative complications, making them a preferable option in appropriately selected cases. 
Keywords: Mandibular condylar fracture, locking plate system, miniplates, bone healing, fracture stability, maxillofacial trauma.
Page No: 707-711 | Full Text
Original Research Article
CLINICAL AND WOUND RELATED FACTORS AFFECTING SPLIT THICKNESS SKIN GRAFT UPTAKE IN SOFT TISSUE DEFECTS
http://dx.doi.org/10.70034/ijmedph.2026.2.122
Rupal Nanda, Divyashree Singh, Ravi Bilunia
Background: Split thickness skin grafting is commonly used for coverage of ulcers and raw areas where primary closure is not possible. Graft uptake depends on local wound condition, infection status and patient related factors. Identification of predictors of poor graft uptake may help in better preoperative preparation. The aim is to assess the clinical, laboratory and wound related factors affecting split thickness skin graft uptake in patients with soft tissue defects. Materials and Methods: This prospective observational study included 166 patients undergoing split thickness skin grafting for ulcers and raw areas. Demographic profile, wound etiology, wound size, anatomical site, comorbidities, laboratory parameters and preoperative wound culture status were recorded. Graft uptake was assessed clinically on postoperative day 7. Good graft uptake was defined as >=80% uptake and poor graft uptake as <80% uptake. Data were analysed using chi-square test and multivariable logistic regression. Results: Good graft uptake was seen in 125 patients (75.3%), while poor graft uptake was seen in 41 patients (24.7%). Good uptake was seen in 90 patients (81.8%) with wounds <=10 x 10 cm compared with 12 patients (50.0%) with wounds >20 x 10 cm. Diabetic foot ulcers had lower good uptake, seen in 14 patients (46.7%). Culture-positive wounds showed poor uptake in 29 patients (47.5%), while sterile wounds showed poor uptake in only 12 patients (11.4%). On multivariable analysis, culture-positive wound, wound size >20 x 10 cm, serum albumin <3.5 g/dL and diabetes mellitus were independent predictors of poor graft uptake. Conclusion: Poor split thickness skin graft uptake was mainly associated with culture-positive wound, large wound size, hypoalbuminemia and diabetes mellitus. Preoperative correction of infection, nutritional status and glycaemic control may improve graft outcome. 
Keywords: Split thickness skin graft, graft uptake, wound culture, diabetic foot ulcer, serum albumin, wound size, soft tissue defect
Page No: 712-717 | Full Text
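The infection finding above can be expressed as a crude (unadjusted) odds ratio from a 2×2 table. The cell counts below are back-derived from the abstract's percentages (29 of ~61 culture-positive wounds and 12 of ~105 sterile wounds with poor uptake, consistent with n = 166); this is an illustration only, not the study's multivariable logistic regression, which adjusts for the other predictors:

```python
def odds_ratio(a, b, c, d):
    """Crude odds ratio for a 2x2 table:
    a = exposed with outcome,   b = exposed without outcome
    c = unexposed with outcome, d = unexposed without outcome"""
    return (a * d) / (b * c)

# Counts back-derived from the reported percentages (assumption, see lead-in):
# culture-positive wounds: 29 poor / 32 good uptake
# sterile wounds:          12 poor / 93 good uptake
or_culture = odds_ratio(a=29, b=32, c=12, d=93)
print(round(or_culture, 2))  # roughly 7: far higher odds of poor uptake
```

A crude OR around 7 for culture-positive wounds is consistent with wound infection emerging as an independent predictor in the multivariable analysis, though the adjusted estimate would differ.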
Case Series
PERIPHERAL OSSIFYING FIBROMA: A CASE SERIES
http://dx.doi.org/10.70034/ijmedph.2026.2.123
Shweta Goyal
Background: The objective is to evaluate the clinical presentation and histopathological characteristics of peripheral ossifying fibroma (POF) through a case series, emphasizing its clinical variability and diagnostic features. Materials and Methods: This descriptive case series was conducted in the Department of Pathology at Shri Jagannath Pahadia Medical College, Bharatpur, Rajasthan, from 2023 to 2025. A total of six patients clinically diagnosed with gingival overgrowth and confirmed histopathologically as peripheral ossifying fibroma were included. Clinical records were reviewed for demographic details, lesion characteristics, dimensions, and associated symptoms. All lesions were surgically excised and sent for histopathological confirmation. Results: Of the six cases, five (83.3%) were female and one (16.7%) was male, showing a strong female predominance. Age ranged from 11 to 79 years. Lesions were predominantly located on the gingiva, especially in the anterior and interdental regions. Duration varied from 2–3 months to 5–6 years. Most lesions were firm, with one case showing bony-hard consistency. Both pedunculated and sessile forms were observed. Conclusion: Peripheral ossifying fibroma is a reactive gingival lesion with diverse clinical presentations but consistent demographic and histopathological features. Early diagnosis and complete surgical excision with elimination of local irritants are essential to prevent recurrence. Keywords: Peripheral ossifying fibroma, gingival lesion, reactive lesion, oral pathology, case series.
Page No: 718-722 | Full Text
Original Research Article
DIAGNOSTIC UTILITY OF FROZEN SECTION IN OPERATIVE SURGICAL SPECIMENS
http://dx.doi.org/10.70034/ijmedph.2026.2.124
Deepa Janghel, Shende Pranali Kisandas
Background: Intraoperative consultation by the frozen section technique is an invaluable tool for immediate diagnosis. Its accuracy and limitations vary with different anatomical sites. Correlation of the intraoperative frozen section diagnosis with the final diagnosis on routine histopathological sections is an integral part of quality assurance in surgical pathology. Materials and Methods: 106 tissue specimens from 64 cases were studied over a period of one year. The diagnostic accuracy of frozen section, and its morphological quality and reliability in comparison with histopathology, were evaluated for overall morphology. The turnaround time, limitations in section preparation, and problems encountered were assessed. Results: The diagnostic accuracy of frozen section was 98.43%. Intraoperative consultation using frozen section added an addendum in 29.68% of cases. Conclusion: Intraoperative frozen section diagnosis is very useful and highly accurate, and provides the rapid, reliable, cost-effective information necessary for optimum patient care. Keywords: Frozen section, Oncopathology, Surgical pathology.
Page No: 723-726 | Full Text
Original Research Article
STUDY TO COMPARE THE EFFICACY OF BREASTFEEDING AND ORAL SUCROSE ON THE PAIN EXPERIENCE OF NEONATES AND INFANTS DURING VENIPUNCTURE USING NEONATAL INFANT PAIN SCALE
http://dx.doi.org/10.70034/ijmedph.2026.2.125
Sandupatla Kiran, M. Suma Priya, Tumukunta Vaishnavi
Background: Aim of the Study: To compare the efficacy of breast feeding and oral sucrose on pain experience of neonates and infants during venipuncture using neonatal infant pain scale. Objectives: 1. To determine and compare the degree of pain in subjects receiving breastfeed and the subjects receiving oral sucrose using neonatal infant pain scale. 2. To determine the association between demographic variables and pain perception. Materials and Methods: It was a Randomized interventional stud carried out at Department of PAEDIATRICS, Malla Reddy Narayana Multi-Specialty Hospital, Suraram, Telangana. During the period 18 months (September 2022 to February 2024). Results: This randomized interventional study was done to compare the efficacy of breastfeeding and oral sucrose on the pain experience of neonates and infants during venipuncture using neonatal infant pain scale. 85 subjects each were allocated to two groups randomly. Group 1 received breastfeeding 10 minutes prior to the procedure and group 2 received 2ml of 24% oral sucrose two minutes prior to the procedure. Comparison of the two groups was based on the efficacy of breastfeeding and oral sucrose for pain experience during venipuncture using neonatal infant pain scale. According age of gestation, in the BF group 35 were pre- terms and 50 were term and in the sucrose group 42 were pre- term and 43 were term babies. The birth weight among the groups were in the breast-feeding group 33 were LBW and 52 had normal birth weight with the mean being 2.79± 0.6. In the sucrose group 31 were LBW and 54 were normal with the mean being 2.74±0.6 kgs. Mean NIPS in breastfeeding subjects was 1.12± 0.8. In the sucrose group mean NIPS being 1.12± 0.8. The overall mean NIPS was 1.33± 0.6. A significant association was seen between the groups suggesting sucrose fed babies had higher pain. 
In the breast-fed group, males had a mean NIPS score of 1 ± 0.8 and females 1.1 ± 0.8, a difference that was statistically insignificant (p > 0.05). Similarly, in the sucrose-fed group, males had a mean NIPS score of 1.5 ± 0.9 and females 1.5 ± 0.8, which was also statistically insignificant. When the two groups were compared, a statistically significant (p < 0.05) difference was noted, suggesting that the sucrose-fed group had higher NIPS with respect to gestation. Likewise, a statistically significant (p = 0.04) difference was noted, suggesting that the sucrose-fed group had higher NIPS with respect to normal and low birth weight. Conclusion: According to the findings of the present study, the lowest pain scores and crying times were in breastfed neonates. Considering that breastfeeding is a natural, useful and free intervention that does not need any special facility, this method is suggested for pain management and control during painful procedures in infants. Keywords: Pain score, NIPS, Oral sucrose, Breast feeding, Venipuncture.
Page No: 727-733 | Full Text
Original Research Article
ASSESSMENT OF ADVERSE DRUG REACTIONS OF CISPLATIN, CARBOPLATIN AND OXALIPLATIN: A PROSPECTIVE OBSERVATIONAL STUDY
http://dx.doi.org/10.70034/ijmedph.2026.2.126
G. Arathi, Jayashree Ganesan
View Abstract
Background: Platinum-based chemotherapy agents—cisplatin, carboplatin, and oxaliplatin—are widely used in the treatment of solid malignancies but are associated with varying toxicity profiles that may impact treatment outcomes. Materials and Methods: This prospective observational study included 165 adult patients with histologically confirmed cancers treated at a tertiary care hospital in Chennai over one year. Patients received standard platinum-based regimens and were monitored each cycle. Toxicities were assessed using CTCAE criteria, neurotoxicity with EORTC QLQ-CIPN-20, and performance status by WHO scale. Statistical analysis was performed using ANOVA and chi-square tests. Results: Carboplatin showed higher hematological toxicity, including severe anemia, leucopenia, and thrombocytopenia. Cisplatin was associated with increased nephrotoxicity and severe nausea and vomiting. Oxaliplatin demonstrated lower hematological and renal toxicity but higher incidence of peripheral neuropathy and diarrhoea. Differences were statistically significant (p < 0.05). Conclusion: Distinct toxicity profiles exist among platinum agents. Individualized selection can help minimize adverse effects and improve patient outcomes. Keywords: Toxicity profile, Chemotherapy, Nephrotoxicity, Neurotoxicity, Hematological toxicity.
Page No: 734-739 | Full Text
Original Research Article
SCORE BASED SEVERITY ASSESSMENT AND RISK STRATIFICATION OF ACUTE PANCREATITIS IN A TERTIARY CARE CENTRE: AN OBSERVATIONAL, COHORT STUDY
http://dx.doi.org/10.70034/ijmedph.2026.2.127
Ritvik D. Jaykar, Santoshkumar N. Deshmukh, Aakanksha A. Giri
View Abstract
Background: Acute pancreatitis outcomes are scored using the bedside index of severity in acute pancreatitis (BISAP), Ranson criteria, acute physiology and chronic health evaluation (APACHE-II) and the modified CT severity index (CTSI) score. The utility of these parameters is limited by their result-related variability. We studied these scores along with the clinical characteristics of patients admitted to our tertiary care centre with acute pancreatitis. Materials and Methods: Patients with acute pancreatitis were included based on clinical features, pancreatic enzyme levels, and radiology. Data on clinical features, investigations and outcomes (mortality and complications) were obtained. Ranson criteria, BISAP, APACHE-II and modified CTSI scores were also assessed where feasible. Results: Of the 100 patients included, 52% were male, with a median age of 51 years. Nearly one-fifth (21%) of the patients died in hospital, of whom 71.43% were female. Common complications were shock (44%), pleural effusion (43%) and acute respiratory distress syndrome (ARDS) (37%). Regression analysis of Ranson criteria at admission showed higher odds of complications in tertiles 1 and 2 (crude OR: 1.62, 3.6), even after adjustment for covariates (adjusted OR: 1.63, 4.81). Ranson criteria at 48 hours could predict mortality risk in tertiles 2 and 3 (unadjusted OR: 4.17, 3.2; adjusted OR: 5.98, 1.25). Conclusion: The modified Ranson criteria proved useful in predicting complications based on the admission score and in-hospital mortality based on the cumulative 48-hour score. Other scores, such as modified CTSI, BISAP and APACHE-II, also showed consistent associations with poor outcomes. Keywords: Pancreatitis, complications, mortality, organ dysfunction scores, outcome assessment.
Page No: 740-744 | Full Text
Original Research Article
HISTOPATHOLOGICAL VERSUS CYTOLOGICAL EVALUATION OF BONE MARROW: A STUDY ON THE CONCORDANCE BETWEEN ASPIRATION AND TREPHINE BIOPSY
http://dx.doi.org/10.70034/ijmedph.2026.2.128
Trupti Gorte, Keshao Hiwale, Sunita Vagha, Amrita Singh Chauhan
View Abstract
Background: Bone marrow evaluation (BME) is crucial in the work-up of haematological and certain non-haematological conditions. Bone marrow evaluation not only confirms clinically suspected diagnoses but also reveals previously unsuspected ones. The cytological preparation of bone marrow, obtained by aspiration, reveals cellular morphology and allows ancillary testing (flow cytometry, molecular genetics), while the histological sample, usually obtained with a Jamshidi needle, allows optimal evaluation of cellularity, fibrosis or infiltrative disease. Bone marrow aspirate and trephine biopsy specimens are nowadays considered complementary, and when both are obtained they provide a thorough examination of the bone marrow. Materials and Methods: During the one-year study period, 111 BMEs, comprising both bone marrow aspiration and bone marrow biopsy (BMB), were performed for various indications. Results: The mean age was 40.6 years, and the male-to-female ratio was 1.3:1. In this study comprising 111 patients, the majority were in the 21–40 years age group (42 cases, 37.8%), followed by the 41–60 years group (28 cases, 25.2%) and the 2–20 years group (24 cases, 21.6%). The least represented was the 61–80 years age group (17 cases, 15.3%). A slight male predominance was observed overall, with 62 males (55.9%) and 49 females (44.1%). Assessment of bone marrow cellularity revealed that normocellular marrow was most common (42%), followed by hypercellular (34%), hypocellular (19%), and diluted samples (16%). Conclusion: This study reaffirms that while aspiration remains a valuable diagnostic tool, trephine biopsy is indispensable for comprehensive assessment, especially in complex, inconclusive, or architecturally dependent marrow pathologies. For optimal diagnostic accuracy and patient care, the combined use of both BMA and BMB should remain standard practice. Keywords: Bone Marrow Aspiration, Bone Marrow Biopsy, Leukaemia.
Page No: 745-749 | Full Text
Original Research Article
CLINICAL PROFILE OF ACUTE PANCREATITIS AT TERTIARY CARE HOSPITAL IN NORTH KASHMIR
http://dx.doi.org/10.70034/ijmedph.2026.2.129
Arvind Kumar Gautam, Shair Ahmad Dar, Zaffar Iqbal Kawoosa, Mir Mushtaq, Ifrah Reshi, Sajad, Nowsheen Nazir Parray, Aisha Ahmad Dar
View Abstract
Background: Acute pancreatitis is a common and potentially life-threatening inflammatory condition of the pancreas with a wide spectrum of clinical presentations, ranging from mild, self-limiting disease to severe systemic illness. The etiology and clinical profile vary across geographic regions. Aims & Objectives: This study aimed to evaluate the clinical characteristics, etiological factors, and outcomes of acute pancreatitis in patients presenting to a tertiary care hospital in North Kashmir. Materials and Methods: This was a hospital-based observational study conducted at a tertiary care center in North Kashmir. Patients diagnosed with acute pancreatitis based on standard diagnostic criteria (clinical features, elevated pancreatic enzymes, and radiological findings) were included. Demographic details, etiological factors, clinical presentation, laboratory parameters, imaging findings, severity assessment, complications, and outcomes were recorded and analyzed using appropriate statistical methods. Results: A total of 200 patients were included in the study. The majority were females (68%), and the most represented age group was 31–40 years (35%). The most common etiological factor was biliary disease (58.5%), followed by idiopathic (24.5%), drugs (7%), hypertriglyceridemia (4%), hyperparathyroidism (2%), post-ERCP (2%), anatomical causes (1%), trauma (1%) and viral causes (1%). The predominant presenting symptom was abdominal pain (100%), often associated with vomiting and abdominal tenderness. Based on severity classification, 47.5% had mild, 32.5% had moderately severe, and 20% had severe pancreatitis. The most prevalent complication was hypocalcemia (27%), followed by ascites and kidney injury (9% each), pleural effusion (8%), and organ failure (4%). The overall mortality rate was 6%, with higher mortality associated with severe disease and organ dysfunction.
Conclusion: Acute pancreatitis in this region shows distinct etiological and clinical patterns, with gallstone (biliary) disease being the leading cause. Early diagnosis, severity stratification, and timely management are crucial in reducing morbidity and mortality. Regional data such as these are essential for improving clinical outcomes and guiding preventive strategies. Keywords: Acute pancreatitis; Clinical profile; Etiology; Severity; Complications; North Kashmir; Tertiary care; Outcome.
Page No: 750-756 | Full Text
Original Research Article
PREDICTORS OF OUTCOME IN ACUTE SUBDURAL HEMATOMA IN A RESOURCE-LIMITED SETTING: A RETROSPECTIVE OBSERVATIONAL STUDY
http://dx.doi.org/10.70034/ijmedph.2026.2.130
Arvind Kumar Bodda, KVV Satyanarayana, K Chandrashekhar Reddy
View Abstract
Background: Acute subdural hematoma is one of the most critical brain injuries and carries a very high mortality rate. Its outcome depends on a variety of factors, such as age, comorbidities, the medications the patient is taking, mechanism of injury, time of presentation, and time to surgical evacuation or the start of conservative measures. Although timely intervention has brought mortality down, there is still a long way to go. The primary objective was to study the factors responsible for possible neurological deterioration in cases of acute subdural hematoma. The secondary objective was to study the accuracy of this prediction over the course of in-hospital treatment. Materials and Methods: A prospective study conducted at a tertiary care hospital, enrolling all patients with acute subdural hematoma. Detailed evaluations, including history and clinical and radiological examinations, were performed to determine a possible SDH risk score and to categorize patients as definitely requiring surgery, borderline, or low risk. Borderline patients were divided into surgical and non-surgical groups, and the non-surgical group was monitored for neurological deterioration. Results: 80 patients with acute subdural hematoma meeting the inclusion and exclusion criteria were enrolled. Patients with a high-risk score underwent surgical evacuation. Patients with a borderline score were given the option of continuing conservative management or undergoing surgical evacuation after the risks of conservative management and of surgery were fully explained. Outcomes in both the surgical and conservative arms of the borderline cases were closely followed for a period of 14 days. 5 patients fell into the borderline category. 38 patients underwent surgery, and 42 patients were treated conservatively. 5 patients under conservative management in the borderline category showed neurological deterioration and eventually underwent surgery.
Conclusion: Acute subdural hematoma is a leading cause of mortality in traumatic brain injury. The SDH risk score is a useful tool for suggesting the best line of management. With timely intervention, borderline cases can also be salvaged by surgical evacuation. However, a larger sample size is needed to extend the findings to all traumatic acute subdural hematoma patients. Keywords: Brain injury, trauma, Acute subdural hematoma
Page No: 757-759 | Full Text
Original Research Article
A CROSS SECTIONAL STUDY ON BREASTFEEDING PRACTICES AMONG POSTNATAL WOMEN IN FIELD PRACTICE AREA OF TERTIARY HEALTH CARE CENTRE, HYDERABAD, TELANGANA
http://dx.doi.org/10.70034/ijmedph.2026.2.131
Bhavana Laxmi Surity, Singirikonda Srikanth, Aruna Ch, Arundhathi Baki
View Abstract
Background: Breastfeeding is nature's way of nurturing the child, creating a strong bond between mother and child. Breast milk is the ideal first natural food, more easily digested and absorbed by the infant than formula milk. Under normal conditions, Indian mothers secrete 450–600 mL of milk daily, with 1.1 g of protein and 70 kcal of energy per 100 mL of human milk.[1] Breast milk is regarded as the 'gold' standard for good infant nutrition at birth. The aim was to study breastfeeding practices among postnatal mothers in the urban and rural field practice areas of a tertiary health care centre, Hyderabad, Telangana. The objectives were to study the prevalence of exclusive breastfeeding practice among the study subjects, to study the sociodemographic factors influencing breastfeeding practices among study subjects, and to compare the breastfeeding practices of study subjects in urban and rural field practice areas. Materials and Methods: A community-based cross-sectional study was conducted among N=520 postnatal mothers in the urban and rural field practice areas of the Department of Community Medicine, Osmania Medical College. Postnatal mothers of infants were enrolled after obtaining consent. A simple random sampling technique was followed in selecting the study population. A pretested, semi-structured questionnaire was used to collect data on breastfeeding practices by interviewing mothers of infants. Inclusion criteria: postnatal mothers of infants aged 0–12 months of both sexes who gave consent for the study. Exclusion criteria: postnatal mothers of infants with congenital birth defects; infants in whom breastfeeding is contraindicated (e.g., galactosemia, psychosis of the mother); postnatal mothers who did not give informed consent. Results: In the present study, the majority of infants, 45.5% (237), belonged to the 6–9 months age group.
In the present study, the prevalence of early initiation of breastfeeding within 1 hour of childbirth was 45% (117) in the urban area, 43.5% (113) in the rural area, and 44.2% (230) in the total study population. The prevalence of exclusive breastfeeding among infants aged 0–12 months was 71.9% (187 of 260) in urban areas and 74.23% (193 of 260) in rural areas. Conclusion: Antenatal care (ANC) counselling during pregnancy to encourage proper breastfeeding practices among mothers should be strengthened and should be provided as a continuum of care by appropriately trained health-care professionals and community-based lay and peer breastfeeding counsellors. Counselling should anticipate and address important challenges and contexts for breastfeeding, in addition to building skills, competencies and confidence among mothers. Keywords: Exclusive Breast Feeding, Antenatal Mother, Infant.
Page No: 760-766 | Full Text
Original Research Article
KNOWLEDGE, ATTITUDE AND PERCEPTION REGARDING SCHIZOPHRENIA AMONG UNDERGRADUATE STUDENTS: A CROSS-SECTIONAL STUDY
http://dx.doi.org/10.70034/ijmedph.2026.16.2.132
Abhishek Dhawan, Yashaswi Ninawe, Adchitre SA, Akshita Sharma, Vishal Pahune, Andrea Almeida
View Abstract
Background: Schizophrenia is a chronic psychiatric disorder associated with significant stigma, social exclusion, and poor health-seeking behavior. Understanding awareness and attitudes among young adults is essential for early intervention and stigma reduction. Aim: To assess knowledge, attitude, and perception regarding schizophrenia among undergraduates. Objectives: 1. To assess the level of knowledge regarding schizophrenia among undergraduate students, including its nature, causes, clinical features, and course of illness. 2. To evaluate attitudes of undergraduate students toward individuals with schizophrenia, particularly in social, familial, and occupational contexts. 3. To explore perceptions and beliefs related to stigma, chronicity, treatability, and functional impact of schizophrenia among undergraduate students. 4. To determine the association between socio-demographic factors (such as gender and age) and knowledge, attitude, and perception regarding schizophrenia. Materials and Methods: A cross-sectional study was conducted among 112 undergraduate students using a pre-validated, self-administered questionnaire. Data were analysed using proportions and percentages. The chi-square test was applied to assess gender-wise differences. A p-value <0.05 was considered statistically significant. Results: The majority (96.4%) identified schizophrenia as a psychiatric illness. Females demonstrated a more positive attitude towards employment and social inclusion. Misconceptions regarding dangerousness and heredity persisted. The gender difference was statistically significant for employment-related attitude (χ² = 4.12, p = 0.04). Conclusion: Despite adequate baseline knowledge, stigma and uncertainty persist. Structured mental health education at undergraduate level is strongly recommended. Keywords: Schizophrenia, Knowledge, Attitude, Undergraduate students, Mental health literacy.
Page No: 767-775 | Full Text
Original Research Article
EFFECT OF HEALTH EDUCATION INTERVENTION ON AVERAGE DAILY STEP COUNT AMONG NURSING OFFICERS IN A GOVERNMENT MEDICAL COLLEGE & HOSPITAL OF HARYANA
http://dx.doi.org/10.70034/ijmedph.2026.2.133
Bharti, Sumit Chawla, Bharti, Manju Rani, Muskan Saini, Leena Yadav
View Abstract
Background: Physical inactivity is a major modifiable risk factor for non-communicable diseases such as cardiovascular disease, diabetes mellitus, obesity, and certain cancers. Health education interventions are recognized as an effective strategy to promote physical activity by increasing knowledge, motivation, and self-efficacy. Interventions focusing on simple, achievable goals, such as increasing daily step count, are particularly suitable for busy healthcare workers. Materials and Methods: This was a quasi-experimental, before–after interventional study conducted over a period of four months, from August 2024 to November 2024. A total of 112 nurses were included in the study. Appropriate statistical tests were used for the calculation of various parameters. Results & Conclusion: The telephonic health education intervention resulted in a statistically significant increase in average daily step count among nursing officers (3030.00 ± 1424.51 steps before intervention vs 3360 ± 1393.40 after intervention). Given the sedentary baseline activity levels observed, structured behavioral reinforcement through periodic telephonic contact may serve as a feasible and cost-effective workplace strategy to promote physical activity among healthcare professionals. Keywords: Health education, Physical inactivity, Steps.
Page No: 776-781 | Full Text
Original Research Article
MORBIDITY PATTERN, HEALTH SEEKING BEHAVIOUR AND ITS RELATED FACTORS AMONG THE ELDERLY URBAN DWELLERS OF NAGAON TOWN, ASSAM, INDIA
http://dx.doi.org/10.70034/ijmedph.2026.2.134
Hiyeswar Borah, Nibir Nath Sarma, Kalyan Borah, Jahnabi Das, Simmy Gavel
View Abstract
Background: With India projected to have nearly 20% of its population in the geriatric age group, the changing demographic structure underscores the crucial role of older adults in nation-building through community engagement, knowledge dissemination, and shared experiences. Ensuring graceful ageing requires health-promotional activities such as recreation, physical activity, and social contact. Comprehensive geriatric assessment and coordinated screening programs are necessary for an independent life in later stages. This study aims to assess the prevalence of morbidity patterns, their related factors, and the health-seeking behaviour of elderly people, while also fostering awareness of health-promotional activities. Materials and Methods: A community-based cross-sectional study was conducted from May to October among elderly (≥60 years) people in urban areas of Nagaon town, Assam. A sample size of 384 was taken from 9 wards selected randomly from 28 registered wards. Data were collected using a pre-designed, pre-tested and semi-structured proforma by interviewing the elderly after taking informed consent. Ethical clearance was obtained from the Institutional Ethics Committee. Elderly persons not willing to participate or giving incomplete information were excluded. Results: In total, 54.7% males and 45.3% females were included in the study. The majority (64.6%) were in the 60–74 years age group. Overweight was found in 42.9%, and 16.6% were obese. Hypertension was the commonest morbidity (63.8%) among the elderly, followed by DM (37.5%). About 16.4% of the elderly had at least two comorbidities. Hypertension was more prevalent among people above 85 years (80%) than in the other age groups. Conclusion: Hypertension and diabetes were the commonest morbidities among the participants. Overweight and obesity were also common. In our study, most of the elderly lacked physical activity.
This study implies that there is a need to create awareness among the general public highlighting the importance of physical activity for healthy ageing. Keywords: Elderly, Physical activity, Morbidity, Health seeking behaviour, Urban dwellers.
Page No: 782-786 | Full Text
Original Research Article
COMPARISON OF CONTINUOUS EPIDURAL ANALGESIA VERSUS CONTINUOUS FASCIA ILIACA BLOCK FOR POST OPERATIVE ANALGESIA IN PATIENTS UNDERGOING HIP SURGERIES: A RANDOMISED CONTROLLED STUDY
http://dx.doi.org/10.70034/ijmedph.2026.2.135
V S V Bhavya Krishna Prasad, K. Mohan, V. Prem Kumar
View Abstract
Background: Effective postoperative pain management is essential following major hip surgeries to facilitate early mobilisation, reduce complications, and improve functional recovery. While continuous epidural analgesia has long been considered the gold standard, it is associated with haemodynamic instability. The fascia iliaca compartment block (FICB) has emerged as a promising alternative, offering effective analgesia with potentially fewer systemic effects. This study compares the efficacy and safety of continuous epidural analgesia and continuous fascia iliaca block in patients undergoing hip surgeries. Materials and Methods: This prospective randomized controlled study was conducted over 18 months and included 90 patients undergoing elective hip surgeries. Patients were randomly allocated into two groups: Group E (continuous epidural analgesia) and Group F (continuous suprainguinal FICB), with 45 patients in each group. All patients received standardized spinal anaesthesia intraoperatively. Postoperatively, continuous analgesia was maintained via epidural catheter in Group E and ultrasound-guided fascia iliaca catheter in Group F. Pain was assessed using the Visual Analogue Scale (VAS) at predefined intervals. Haemodynamic parameters, rescue analgesic requirement, and adverse effects were recorded. Statistical analysis was performed using independent t-test and Chi-square test, with p <0.05 considered significant. Results: Baseline characteristics were comparable between the two groups (p >0.05). VAS scores were significantly lower in the epidural group from 1 hour to 24 hours postoperatively (p <0.001), indicating superior analgesic efficacy. Rescue analgesic requirement was significantly reduced in Group E (820 ± 215 mg) compared to Group F (1320 ± 260 mg) (p <0.001). Mean arterial pressure was significantly lower in the epidural group throughout the postoperative period (p <0.001), with a higher incidence of hypotension (20.0% vs 4.4%, p = 0.03). 
Heart rate, oxygen saturation, and respiratory rate were comparable between groups. No cases of respiratory depression were observed. Conclusion: Continuous epidural analgesia provides superior and sustained postoperative analgesia with reduced analgesic requirements but is associated with increased risk of hypotension. Continuous fascia iliaca block offers better haemodynamic stability with adequate pain control, making it a valuable alternative, particularly in patients where haemodynamic stability is a concern. Keywords: Epidural analgesia; Fascia iliaca compartment block; Hip surgery; Postoperative pain; Visual Analogue Scale.
Page No: 787-790 | Full Text
Original Research Article
PARENTAL AWARENESS, KNOWLEDGE AND PREVALENCE OF REFRACTIVE ERRORS AMONG CHILDREN OF HOSPITAL PERSONNEL IN A TERTIARY CARE CENTRE
http://dx.doi.org/10.70034/ijmedph.2026.2.136
Suman M G, Sahana A, Zoya Mohsin, Arjuman Sadaf A, Govindaraj M
View Abstract
Background: Refractive errors are the leading cause of preventable visual impairment in children. Early detection and timely correction are essential to prevent long-term visual disability. This study aimed to determine parental awareness of refractive errors, and associated risk factors, among children of hospital personnel at a tertiary care centre. Materials and Methods: A hospital-based cross-sectional study was conducted from May to July 2025 among 100 children aged 6 months to 18 years. A comprehensive ocular examination including cycloplegic refraction was performed. Parental awareness of the need for screening, age at first evaluation, parental refractive status, and screen time were recorded. Data were analysed using SPSS version 22, and associations were tested using the Chi-square or Fisher's exact test. Results: Parental awareness was significantly higher among doctors compared to paramedical staff (p < 0.001). Among children diagnosed with amblyopia, 75% of parents were unaware of the condition. The overall prevalence of refractive error was 47%, with myopia (34%) and astigmatism (33%) being the most common types. No significant association was observed with age or gender. A significant association was found between parental refractive error and refractive error in children (p = 0.003). Screen time was significantly associated with overall refractive error (p = 0.006), though not specifically with myopia. Conclusion: Despite healthcare proximity, gaps in parental awareness remain, particularly among non-medical staff. Regular screening and targeted educational interventions are essential for early detection and prevention of avoidable visual impairment. Parental refractive status and near-work exposure are important risk factors for childhood refractive errors. Keywords: early detection, amblyopia, screening, astigmatism, screen time.
Page No: 791-797 | Full Text
Original Research Article
IMPACT OF MATERNAL OBESITY ON FETOMATERNAL OUTCOMES IN PREGNANCY
http://dx.doi.org/10.70034/ijmedph.2026.2.137
Momy Gul, Madhu Bala, Asma Kashif, Musrat Mustafa Shah, Farzana Gul, Irum Naz
View Abstract
Background/Objective: To determine the frequency of maternal and fetal complications in pregnancies complicated by maternal obesity; to evaluate its association with adverse maternal outcomes such as gestational diabetes mellitus, hypertensive disorders, and cesarean delivery; and to assess fetal outcomes including macrosomia, preterm birth, and neonatal complications, while also analyzing the relationship between different body mass index categories and pregnancy outcomes and their impact on the mode of delivery and overall perinatal outcomes. Duration and Place of Study: This study was conducted at Bolan Medical College, Quetta, from February 2025 to February 2026. Materials and Methods: A cross-sectional study was conducted on 200 pregnant women with BMI >25 kg/m², gestational age ≥24 weeks, aged 20–35 years, and singleton pregnancies. Women with pre-existing chronic diseases, hypertension, pre-gestational diabetes, infections, or multiple gestations were excluded. Data on demographics, obstetric history, BMI, and clinical findings were collected using a structured proforma. Maternal outcomes included gestational hypertension, gestational diabetes, mode of delivery, and intrapartum complications, while fetal outcomes included macrosomia, stillbirth, and neonatal complications. Data were analyzed using SPSS v26, with continuous variables presented as mean ± SD, categorical variables as percentages, and chi-square tests applied for significance (p<0.05). Results: The study found several fetomaternal complications among pregnant women with obesity. Gestational hypertension was observed in 94 (47%) participants, and gestational diabetes in 80 (40%). Assisted births occurred in 80 (40%) cases, while cesarean sections were performed in 162 (81%). Fetal macrosomia was noted in 54 (27%) cases, and stillbirth in 16 (8%). Multiparous women had higher rates of assisted births compared to primiparous women.
Gestational hypertension, cesarean delivery, and stillbirth were more frequent in women with a BMI above 30 kg/m² compared with those with a BMI of 25.6–30 kg/m². Conclusion: Maternal obesity is strongly associated with adverse pregnancy and birth outcomes. Higher BMI increases the risk of maternal and fetal complications, obstetric interventions, and fetal macrosomia. Careful clinical assessment and antenatal counseling are recommended, and overweight and obese women should be considered high-risk. Keywords: Maternal obesity, Fetomaternal complications, Maternal complications, BMI, Obesity, Hypertension.
Page No: 798-802 | Full Text
Original Research Article
GLANDIN E2 GEL VS GLANDIN E2 TABLET FOR INDUCTION OF LABOR AND SUCCESSFUL DELIVERY
http://dx.doi.org/10.70034/ijmedph.2026.2.138
Sanobar Baloch, Kanta Bai Ahuja, Azra, Shahneela Moosa, Saba Ijaz, Ghazala Arshad
View Abstract
Background: Induction of labour is one of the most frequently performed obstetric interventions worldwide, accounting for nearly 20% of deliveries. Vaginal dinoprostone (prostaglandin E2) is widely used for cervical ripening and labour induction. Different formulations, including gel and pessary (tablet), are available; however, their comparative effectiveness remains a subject of clinical interest. Objective: To compare the efficacy and safety of Glandin E2 gel versus Glandin E2 tablet (pessary) for induction of labour and achievement of successful vaginal delivery. Study design: A randomised controlled study. Duration and place of study: This study was conducted at Indus Medical College, Tando Muhammad Khan, from January 2025 to January 2026. Materials and Methods: This randomised controlled trial was conducted in the Department of Obstetrics and Gynaecology. A total of 120 pregnant women with singleton, live, cephalic presentation pregnancies at ≥37 weeks of gestation with intact membranes were enrolled. Participants were randomly allocated into two equal groups (n = 60 each). Group A received the Glandin E2 tablet (dinoprostone pessary), while Group B received Glandin E2 gel, both administered intravaginally into the posterior fornix. Data were analysed using SPSS version 20. Maternal age and gravidity were recorded, along with induction outcomes. Results: A total of 120 females were included in this study, divided into 2 groups with an equal number of patients in each (n = 60). The mean age in group A was 27.8 years, while in group B it was 26.4 years. The majority of the females were primigravida in both groups. Conclusion: Glandin E2 gel appears to be more effective than the Glandin E2 tablet in achieving successful induction of labour and vaginal delivery. Keywords: Labour induction; Dinoprostone; Prostaglandin E2; Cervical ripening; Glandin E2 gel; Glandin E2 pessary; Randomised controlled trial.
Page No: 803-806 | Full Text
Original Research Article
THE INTERPREGNANCY INTERVAL AS A DETERMINANT OF ADVERSE OBSTETRIC OUTCOMES
http://dx.doi.org/10.70034/ijmedph.2026.2.139
Soniya Mehtab, Kanwal Atif, Madhu Bala, Samina Bugti, Banadi, Nosheen Mushtaq
View Abstract
Background: The objective was to determine the relationship between interpregnancy interval (IPI) and obstetric outcomes in women with previous pregnancies, and whether spacing between pregnancies plays a role in adverse maternal and perinatal outcomes. This study was conducted at Liaquat University of Medical and Health Sciences, Jamshoro, from January 2025 to January 2026. Materials and Methods: This prospective observational study was carried out among 200 booked pregnant women with parity of 1–4. Participants were divided into two groups according to interpregnancy interval: less than 18 months and 18–24 months. Women with chronic medical conditions, multiple pregnancies, or grand multiparity were excluded. Each participant was followed through pregnancy until delivery, and maternal and newborn outcomes were recorded. Standard tests of significance were applied for statistical analysis, and p < 0.05 was regarded as significant. Results: The majority of women (112; 56%) were aged 20–30 years, and 118 (59.0%) belonged to lower socioeconomic levels. Caesarean section was the most common mode of delivery (148; 74%). Among neonatal outcomes, 52 (26%) infants were born with low birth weight and 30 (15%) were preterm. Maternal complications included anemia in 38 (19%), gestational diabetes in 17 (8.5%), and hypertensive disorders in 14 (7%). A significant association was found between short IPI (<18 months) and the adverse outcomes of anemia (p = 0.001), preterm birth (p < 0.001), and low birth weight (p < 0.001). None of the sociodemographic characteristics was significantly associated with IPI. Conclusion: Very short intervals between pregnancies, especially less than 18 months, are closely linked to the risk of maternal anemia, preterm birth, and low birth weight. 
Postpartum counseling and easy access to family-planning services can be very useful in encouraging optimal birth spacing of 18–24 months, which in turn can lead to significant improvement in the health outcomes of both mother and child. Keywords: Interpregnancy interval, Obstetric outcome, Maternal morbidity, Neonatal outcome, Birth spacing.
Page No: 807-812 | Full Text
Original Research Article
PULMONARY DYSFUNCTION IN TYPE 2 DIABETES MELLITUS: INTEGRATED ANALYSIS OF GLYCEMIC BURDEN, ADIPOSITY, AND SPIROMETRIC IMPAIRMENT
http://dx.doi.org/10.70034/ijmedph.2026.2.140
Mahboob Fatima Mohd Sirajuddin Ahmed Siddiqi, Aamir Naushad, Anand N. Badwe
View Abstract
Background: Type 2 Diabetes Mellitus (T2DM) is a chronic metabolic disorder associated with multiple systemic complications. Emerging evidence suggests that the lung may represent an additional target organ; however, pulmonary involvement remains under-recognized. The objective is to evaluate pulmonary function in patients with Type 2 Diabetes Mellitus and to assess its association with glycemic parameters and anthropometric indices. Materials and Methods: This analytical case–control study included 200 participants comprising 100 patients with T2DM and 100 age- and sex-matched healthy controls. Anthropometric parameters including body mass index (BMI), waist circumference, hip circumference, and waist–hip ratio (WHR) were recorded. Glycemic status was assessed using fasting blood sugar (FBS) and postprandial blood sugar (PPBS). Pulmonary function tests were performed using spirometry, measuring FVC, FEV₁, FEV₁/FVC ratio, FEF 25–75%, and PEF. Statistical analysis was conducted using Z-test and Pearson’s correlation. Results: Patients with T2DM exhibited significantly higher BMI, waist circumference, hip circumference, and WHR compared to controls (p < 0.0001). Glycemic parameters (FBS and PPBS) were markedly elevated in the diabetic group (p < 0.0001). Pulmonary function tests revealed significant reductions in FVC and FEV₁ in diabetic patients (p < 0.0001), with a predominantly restrictive pattern of lung impairment observed. Significant negative correlations were found between glycemic parameters and pulmonary function indices, indicating progressive decline in lung function with increasing glycemic burden. Conclusion: Type 2 Diabetes Mellitus is associated with significant impairment in pulmonary function, predominantly of a restrictive nature. Chronic hyperglycemia and increased adiposity appear to play a central role in this dysfunction. These findings support the inclusion of pulmonary function testing in the routine assessment of patients with T2DM. 
Keywords: Type 2 Diabetes Mellitus; Pulmonary Function; Spirometry; Glycemic Burden; Adiposity; Restrictive Lung Disease; Forced Vital Capacity; Insulin Resistance.
Page No: 813-820 | Full Text
Original Research Article
PREVALENCE & PREDICTORS OF MDR TB AMONG PULMONARY TB PATIENTS IN AN ASPIRATIONAL TRIBAL DISTRICT: A CROSS-SECTIONAL STUDY
http://dx.doi.org/10.70034/ijmedph.2026.2.141
Mittal C. Balat, Nehal L. Damor
View Abstract
Background: Multidrug-resistant tuberculosis (MDR-TB) poses a significant challenge to tuberculosis control, particularly in vulnerable and underserved populations. Tribal communities residing in aspirational districts often experience poor healthcare access, malnutrition, and treatment non-adherence, increasing the risk of drug resistance. The objective is to determine the prevalence and predictors of MDR-TB among pulmonary TB patients in an aspirational tribal district. Materials and Methods: A cross-sectional observational study was conducted among 100 pulmonary TB patients registered under the National TB Elimination Programme in an aspirational tribal district. Socio-demographic, clinical, and treatment-related variables were collected using a structured proforma. Drug susceptibility testing was performed using CBNAAT and Line Probe Assay. Statistical analysis was conducted using SPSS. Associations were assessed using the Chi-square test and independent t-test, and odds ratios were calculated with 95% confidence intervals. Results: The prevalence of MDR-TB was 18% (95% CI: 11.5-26.7%). Tribal status (p = 0.019), HIV positivity (p = 0.012), and low BMI (p = 0.015) were significantly associated with MDR-TB. Treatment-related factors such as previous TB treatment (p = 0.003), treatment default (p = 0.002), irregular drug intake (p = 0.001), and substance abuse (p = 0.014) were strong predictors of drug resistance. Age and diabetes mellitus were not significantly associated with MDR-TB. Conclusion: The study revealed a substantial burden of MDR-TB in an aspirational tribal district, with treatment-related non-adherence emerging as the strongest predictor. Strengthened adherence monitoring, early molecular diagnosis, nutritional support, and targeted interventions in tribal communities are essential to reduce the burden of MDR-TB and achieve TB elimination goals. Keywords: Multidrug-resistant tuberculosis; Tribal population; Drug resistance predictors.
Page No: 821-825 | Full Text
Original Research Article
FUTURE OF MONOCLONAL GAMMOPATHY DIAGNOSIS: MASS SPECTROMETRY, A TRANSFORMATION IN PERSPECTIVE BEYOND ELECTROPHORESIS
http://dx.doi.org/10.70034/ijmedph.2026.2.142
E. Maruthi Prasad, Rajesh Bendre
View Abstract
Immunofixation electrophoresis (IFE) and serum protein electrophoresis (SPEP) have been the major diagnostic tools for monoclonal gammopathies (MGs) in diagnostic laboratory settings for over five decades. These techniques remain necessary but have limitations in resolution, subjectivity, and sensitivity. Recently, clinical laboratories have increasingly adopted high-resolution technologies such as mass spectrometry (MS) and next-generation sequencing (NGS). This article discusses the historic role and operational limitations of SPEP and IFE, together with the analytical challenges they pose in diagnostic settings. We also discuss emerging alternatives, such as MS-based immunoglobulin profiling and standardized methods for the detection and quantification of monoclonal proteins by MS. Additionally, analyzing the B-cell repertoire through NGS provides an opportunity to examine the genome for clonal detection. Integrating biochemical and genomic data with biomarkers such as abnormal glycosylation patterns and heavy/light-chain ratios can help evaluate the risk of myeloma. A framework for early intervention, precise surveillance, and individualized treatment approaches in myeloma management is urgently needed. This transformation affects laboratory operations and future planning, and brings a new era for early disease diagnosis, tracing minimal residual disease, and generating personalized predictions of how conditions such as Monoclonal Gammopathy of Undetermined Significance (MGUS) will evolve in their disease progression. Keywords: Monoclonal gammopathy, mass spectrometry, serum protein electrophoresis, immunofixation electrophoresis, Next-generation sequencing, multiple myeloma, MGUS, proteomics, clonality.
Page No: 826-834 | Full Text
Review Article
PRESCRIPTION WRITING SKILLS IN FRESH MEDICAL GRADUATES
http://dx.doi.org/10.70034/ijmedph.2026.2.143
Ramesh Lolla, Shilpa Karkera, Yoga Rajamani, Kapili Dada
View Abstract
Prescription writing is one of the most important clinical responsibilities entrusted to fresh medical graduates and serves as a key indicator of their readiness for independent practice. It is not merely a technical act of writing the name of a medicine, but a complex professional skill that integrates pharmacological knowledge, clinical reasoning, ethical responsibility, legal awareness, communication, and patient safety. This review examines the concept, scope, and educational significance of prescription writing skills in fresh medical graduates, with particular attention to the transition from undergraduate learning to real-world prescribing. The review highlights that new graduates frequently encounter difficulties in converting theoretical knowledge into safe and rational prescribing decisions in actual clinical settings. Common concerns include incomplete prescriptions, unclear instructions, inadequate dose individualization, weak patient counselling, and insufficient safety checks. These problems are influenced not only by individual inexperience, but also by limitations in undergraduate training, lack of repeated practical exposure, and the pressures of busy clinical environments. The review further emphasizes that good prescription writing is grounded in regulatory compliance, ethical prescribing, completeness of documentation, rational drug selection, and sound clinical reasoning before the order is written. It also underlines the importance of communication and patient-centredness, as a prescription must be understandable and usable for patients, pharmacists, nurses, and other healthcare professionals. Keywords: Prescription writing, fresh medical graduates, rational prescribing, patient safety, prescribing education.
Page No: 835-844 | Full Text
Original Research Article
INTEGRATED TEACHING IN MEDICAL COLLEGES: ADVANTAGES AND LIMITATIONS
http://dx.doi.org/10.70034/ijmedph.2026.2.144
Ramesh Lolla, Shilpa Karkera, Hannah Chung, David Dunn
View Abstract
Integrated teaching has become an important reform in undergraduate medical education because it addresses the long-standing problem of fragmented learning in traditional discipline-based curricula. In conventional teaching, basic sciences and clinical subjects are often delivered separately, making it difficult for students to connect foundational knowledge with patient care. Integrated teaching attempts to overcome this gap by linking related disciplines, clinical relevance, professional competencies, and emerging healthcare themes within a more coherent curricular structure. This review examines the concept of integrated teaching in medical colleges, with particular focus on its major advantages and limitations. The review highlights that integrated teaching supports meaningful learning by promoting conceptual linkage between subjects, strengthening horizontal and vertical integration, and improving the relevance of preclinical knowledge through early clinical exposure. It also broadens the educational scope of the curriculum by incorporating patient safety, systems thinking, professionalism, digital health, telehealth, simulation, leadership, nutrition, research literacy, humanities, and future-oriented healthcare needs. Such an approach encourages the development of clinical reasoning, reflective capacity, communication, teamwork, and professional identity in a more connected and learner-centred manner. At the same time, integrated teaching presents several practical and academic challenges. These include the need for clear curricular models, faculty development, strong interdepartmental coordination, resource support, assessment alignment, and protection against superficial or tokenistic implementation. Inadequate planning may lead to confusion, loss of disciplinary depth, curricular overload, and weak educational outcomes. Keywords: Integrated teaching; medical education; curriculum integration; undergraduate medical curriculum; competency-based education.
Page No: 845-854 | Full Text
Original Research Article
A COMPARATIVE STUDY OF DESFLURANE VERSUS SEVOFLURANE IN OBESE PATIENTS UNDERGOING LAPAROSCOPY SURGERY: EFFECT ON RECOVERY PROFILE
http://dx.doi.org/10.70034/ijmedph.2026.2.145
Akanksha Jain, Anchal Tiwari, Gaurita Shrivastava
View Abstract
Background: Obesity presents significant anesthetic challenges due to altered respiratory mechanics, pharmacokinetics, and increased perioperative risks. Volatile anesthetics such as desflurane and sevoflurane are commonly used due to their low blood–gas solubility, but their comparative effects on recovery profiles in obese patients remain an area of clinical interest. Aim: To compare desflurane and sevoflurane with respect to recovery characteristics, cognitive function, hemodynamic stability, and postoperative adverse effects in obese patients undergoing laparoscopic surgery. Materials and Methods: This prospective, randomized, double-blind study included 80 obese patients (BMI ≥30 kg/m²), aged 18–60 years, undergoing elective laparoscopic surgery. Patients were allocated into two groups: desflurane (Group D) and sevoflurane (Group S). Standardized anesthesia protocols were followed. Primary outcomes included recovery parameters (eye opening, extubation, orientation), Modified Aldrete Score, and Mini-Mental State Examination (MMSE). Secondary outcomes included intraoperative hemodynamics, adverse events, and post-anesthesia care unit (PACU) stay duration. Results: Baseline characteristics were comparable between groups. Desflurane demonstrated significantly faster recovery, with shorter times to eye opening (4.6 ± 1.4 vs 7.5 ± 1.5 min), verbal response (5.2 ± 1.3 vs 8.1 ± 1.4 min), and extubation (6.3 ± 1.1 vs 9.2 ± 1.6 min) (p < 0.001). Cognitive recovery was also faster, with earlier return to baseline MMSE (39.6 ± 12.8 vs 51.1 ± 13.5 min; p = 0.001). Modified Aldrete Scores were significantly higher in the desflurane group at all intervals, indicating quicker recovery. Hemodynamic parameters and respiratory variables remained stable and comparable in both groups. The incidence of adverse effects was low and statistically insignificant. PACU discharge readiness was achieved earlier with desflurane (45.8 ± 7.5 vs 58.2 ± 8.4 min; p < 0.001). 
Conclusion: Desflurane provides significantly faster emergence, improved early cognitive recovery, and reduced PACU stay compared to sevoflurane in obese patients undergoing laparoscopic surgery, without compromising hemodynamic stability or safety. Keywords: Desflurane, Sevoflurane, Obesity, Laparoscopic surgery, Recovery profile, MMSE, PACU.
Page No: 855-863 | Full Text
Original Research Article
PRESSURE-CONTROLLED VERSUS VOLUME-CONTROLLED VENTILATION IN ROBOT-ASSISTED PELVIC SURGERIES: A COMPARATIVE ANALYSIS OF RESPIRATORY MECHANICS AND HEMODYNAMIC EFFECTS
http://dx.doi.org/10.70034/ijmedph.2026.16.2.146
Praveen, Henjarappa K S, Mamatha H S, Arathi B H, V. B. Gowda, Srihari S S
View Abstract
Background: Robot-assisted pelvic surgeries require pneumoperitoneum and steep Trendelenburg positioning, which significantly affect respiratory mechanics and cardiovascular physiology. Selection of an optimal ventilation strategy is crucial to minimize pulmonary complications and maintain hemodynamic stability. Pressure-controlled ventilation and volume-controlled ventilation are commonly used modes; however, their comparative intraoperative effects remain inadequately defined in robotic surgical settings. Objectives: To compare the effects of pressure-controlled ventilation and volume-controlled ventilation on respiratory mechanics and hemodynamic parameters in patients undergoing robot-assisted pelvic surgeries. Materials and Methods: This prospective comparative study included 88 adult patients undergoing elective robot-assisted pelvic surgeries, who were randomly allocated into two groups: PCV group (n = 44) and VCV group (n = 44). Standardized anesthetic protocols were followed. Respiratory parameters including peak airway pressure, dynamic compliance, tidal volume, minute ventilation, end-tidal carbon dioxide, and oxygen saturation were recorded during pneumoperitoneum and Trendelenburg positioning. Hemodynamic parameters such as heart rate, systolic blood pressure, diastolic blood pressure, and mean arterial pressure were also evaluated. Safety outcomes including ventilator alarms, intraoperative hypertension episodes, and postoperative pulmonary complications were analyzed. Statistical analysis was performed using appropriate parametric and non-parametric tests. Results: The PCV group demonstrated significantly lower peak airway pressures and higher dynamic lung compliance compared to the VCV group (p < 0.001). Oxygenation and ventilation efficiency remained comparable between the two groups. Hemodynamic parameters were largely similar; however, diastolic blood pressure stability was significantly better in the PCV group (p < 0.001). 
PCV was associated with fewer airway pressure alarms, reduced need for ventilator adjustments, and a lower incidence of intraoperative hypertension episodes (p < 0.05). Postoperative pulmonary complications were numerically lower in the PCV group, though the difference was not statistically significant. Conclusion: Pressure-controlled ventilation provides superior respiratory mechanics and improved intraoperative safety compared to volume-controlled ventilation in robot-assisted pelvic surgeries, while maintaining stable hemodynamics and effective gas exchange. PCV may be considered the preferred ventilation strategy in robotic surgical procedures requiring pneumoperitoneum and steep Trendelenburg positioning. Keywords: Pressure-controlled ventilation, Volume-controlled ventilation, Robot-assisted pelvic surgery.
Page No: 864-869 | Full Text
Original Research Article
CLINICAL PROFILE AND SPECTRUM OF CARDIAC DISEASE IN PREGNANCY IN TERTIARY CARE CENTRE IN NORTH INDIA
http://dx.doi.org/10.70034/ijmedph.2026.2.147
Amit Varshney, Shubhangi Gupta, Preeti Singh
View Abstract
Background: Cardiovascular disease in pregnancy is an important cause of indirect maternal morbidity and mortality, with a shifting pattern in developing countries due to coexistence of rheumatic and congenital heart diseases. Objective: To evaluate the clinical profile and spectrum of heart disease in pregnancy at a tertiary care center. Materials and Methods: This prospective observational study was conducted at a tertiary care hospital from February 2024 to February 2026. A total of 300 pregnant women with diagnosed or suspected cardiac disease were included. Detailed clinical evaluation, electrocardiography, and echocardiography were performed. Data were analyzed using descriptive statistics. Results: Among 300 pregnancies, 13 cases (4.33%) had cardiac disease. Valvular heart disease was the most common (2.00%), with mitral stenosis being predominant (1.33%). Cardiomyopathy was observed in 1.00%, congenital heart disease in 0.67%, while arrhythmia and post-surgical cases each accounted for 0.33%. Most women had vaginal delivery (70%), while 30% underwent cesarean section. Conclusion: Cardiac disease in pregnancy was relatively uncommon but clinically significant. Valvular heart disease remains the predominant lesion. Early diagnosis, multidisciplinary care, and close antenatal monitoring are essential to improve maternal and fetal outcomes. Keywords: Pregnancy; Cardiac disease; Valvular heart disease; Rheumatic heart disease; Cardiomyopathy; Maternal outcome; Echocardiography.
Page No: 870-872 | Full Text
Original Research Article
GENDER VARIATIONS IN CARDIOVASCULAR AUTONOMIC FUNCTION: A CROSS-SECTIONAL STUDY IN MEDICAL STUDENTS
http://dx.doi.org/10.70034/ijmedph.2026.2.148
Bhumi M. Hirani, Hitesh Kumar Solanki
View Abstract
Background: Gender-related differences in cardiovascular autonomic regulation have been widely studied, yet findings remain inconsistent, particularly in young healthy populations. This study aimed to evaluate variations in cardiovascular autonomic function between male and female medical students. Materials and Methods: A cross-sectional study was conducted among 162 undergraduate medical students. Resting heart rate and blood pressure were recorded, along with autonomic function tests assessing parasympathetic activity (deep breathing difference, E:I ratio, 30:15 ratio) and sympathetic activity (blood pressure response to standing, sustained handgrip test, and cold pressor test) using standardized protocols. Data were analysed using the independent t-test and the Mann–Whitney U test. A p-value ≤ 0.05 was considered statistically significant. Results: Females exhibited a significantly higher resting heart rate compared to males (80.47 ± 14.45 vs 76.17 ± 12.65 bpm; p = 0.036). In contrast, males had significantly higher systolic (120.46 ± 7.14 vs 114.35 ± 6.19 mmHg; p < 0.001) and diastolic blood pressure (77.19 ± 4.20 vs 74.71 ± 5.97 mmHg; p = 0.022). Parasympathetic function parameters, including deep breathing difference, E:I ratio, and 30:15 ratio, were slightly higher in males but did not differ significantly. Similarly, sympathetic function tests, including fall in systolic blood pressure on standing, rise in diastolic blood pressure during handgrip, and rise in systolic blood pressure during cold pressor test, showed no statistically significant gender differences. Conclusion: While significant gender differences were observed in resting cardiovascular parameters, autonomic function indices were comparable between males and females, suggesting similar autonomic regulation in young healthy individuals. Keywords: Cardiovascular autonomic function, gender differences, heart rate variability, sympathetic activity, parasympathetic activity, medical students.
Page No: 873-877 | Full Text
Original Research Article
STUDY OF MICROALBUMINURIA IN OBESE INDIVIDUALS AS AN EARLY INDICATOR OF NEPHROPATHY
http://dx.doi.org/10.70034/ijmedph.2026.2.149
Sanketh Janardhan, Mohit A Kalyankar, Aminta Albert, Relvin Roy, Rabeca Johnson, Sanabil S P, Abdul Haque Usman Pulath Puthanath
View Abstract
Background: Obesity is a major global health concern associated with metabolic, cardiovascular, and renal complications. Increasing evidence suggests that obesity alone, even in the absence of hypertension and diabetes, can lead to early renal damage through mechanisms such as endothelial dysfunction, glomerular hyperfiltration, and chronic inflammation. Microalbuminuria, defined as urinary albumin excretion of 30–300 mg/g creatinine, is a sensitive and early marker of both renal impairment and generalized endothelial dysfunction. The aim was to evaluate endothelial dysfunction due to obesity using microalbuminuria as a marker and to study obesity-induced nephropathy in normotensive, non-diabetic obese individuals. The primary objective was to evaluate microalbuminuria as an early, non-invasive indicator of nephropathy in normotensive, non-diabetic obese individuals; secondary objectives were to determine the prevalence of microalbuminuria in this population, to assess the correlation between microalbuminuria and Body Mass Index (BMI), and to explore the need for routine screening for early detection of subclinical renal impairment. Materials and Methods: This descriptive cross-sectional observational study was conducted in the Department of General Medicine at Sri Chamundeshwari Medical College Hospital & Research Institute over a period of 6 months from November 2025 to April 2026. A total of 87 obese individuals (BMI >23 kg/m² as per the Asia-Pacific classification), aged 18–70 years, who were normotensive and non-diabetic were included. Detailed clinical evaluation, anthropometric measurements, and laboratory investigations including fasting blood sugar and HbA1c were performed. Microalbuminuria was assessed using the spot urine albumin-to-creatinine ratio (ACR) by the Roche immunoturbidimetric method and defined as ACR 30–300 mg/g creatinine. Statistical analysis was carried out using SPSS version 20, with p < 0.05 considered statistically significant. 
Results: Out of 87 participants, 31 (35.6%) were found to have microalbuminuria. The prevalence increased with rising BMI, from 16.7% in overweight individuals to 56.0% in those with BMI ≥30 kg/m². A statistically significant association was observed between BMI categories and microalbuminuria (p = 0.007). Additionally, a moderate positive correlation was found between BMI and urinary ACR (r = 0.48, p = 0.001), indicating increasing renal involvement with higher adiposity. Conclusion: Microalbuminuria is significantly prevalent among normotensive, non-diabetic obese individuals and correlates positively with BMI. It serves as an early, non-invasive marker of obesity-related endothelial dysfunction and nephropathy. Routine screening for microalbuminuria in obese individuals may facilitate early detection and timely intervention, thereby preventing progression to overt renal disease. Keywords: Obesity, Microalbuminuria, Nephropathy, Endothelial Dysfunction, Body Mass Index (BMI), Albumin-Creatinine Ratio (ACR)
Page No: 878-882 | Full Text
Original Research Article
EFFECTIVENESS OF CALCIUM DOBESILATE COMPARED TO PLACEBO IN CHRONIC VENOUS INSUFFICIENCY: A COMPARATIVE STUDY
http://dx.doi.org/10.70034/ijmedph.2026.2.150
Abhishek Gaur, Nidhi Gaur
View Abstract
Background: Chronic venous insufficiency (CVI) refers to a condition in which the venous system fails to effectively return blood to the heart in a unidirectional manner, thereby compromising tissue drainage, temperature regulation, and overall hemodynamic balance, irrespective of body position or physical activity. Several randomized controlled trials have demonstrated the therapeutic benefits of calcium dobesilate in managing CVI. The present study aimed to evaluate and compare the efficacy of calcium dobesilate with placebo in patients diagnosed with chronic venous insufficiency. Materials and Methods: This study was conducted in the Department of Surgery at Muzaffarnagar Medical College, Muzaffarnagar, Uttar Pradesh, India, over a period of two years from March 2020 to March 2022. Results: A total of 150 participants were included and equally divided into two groups. One group received calcium dobesilate (500 mg) twice daily for 8 weeks, while the other group was administered a placebo. Significant improvement (P < 0.05) was observed in symptoms such as pain, leg heaviness, fatigue, tingling, itching, and muscle cramps in the calcium dobesilate group compared to the placebo group. Both patient-reported outcomes and investigator assessments showed better overall improvement in the treatment group. Conclusion: Calcium dobesilate significantly alleviates symptoms in patients with clinically diagnosed CVI, irrespective of concurrent use of compression stockings. Keywords: Calcium dobesilate, Chronic venous insufficiency, Venous disorders.
Page No: 883-886 | Full Text
Original Research Article
A COMPARATIVE EVALUATION OF FISTULECTOMY WITH SPHINCTEROPLASTY VERSUS FISTULECTOMY WITH SPHINCTEROPLASTY COMBINED WITH MARTIUS FLAP REPAIR IN THE MANAGEMENT OF RECTOVAGINAL FISTULA
http://dx.doi.org/10.70034/ijmedph.2026.2.151
Abhishek Gaur, Nidhi Gaur
View Abstract
Background: Surgical intervention remains the primary and definitive treatment for anal fistula, a condition recognized for more than two millennia. Despite advances in operative techniques, management continues to be challenging due to variable recurrence rates and the potential risk of postoperative incontinence. The aim is to compare the efficacy, clinical effectiveness, and postoperative outcomes of fistulectomy with sphincteroplasty versus fistulectomy with sphincteroplasty combined with Martius flap repair. Materials and Methods: This study included a total of 24 patients diagnosed with anal fistula. Among them, 13 patients underwent fistulectomy with sphincteroplasty, while 11 patients were treated with fistulectomy, sphincteroplasty, and Martius flap repair. Patients were evaluated for surgical success, recurrence, and postoperative complications. Results: The success rate in the fistulectomy with sphincteroplasty group was 38.4% (5 patients), with recurrence observed in 8 patients. In contrast, the group treated with the addition of Martius flap repair demonstrated a higher success rate of 72.7% (8 patients). Recurrence occurred in 3 patients in this group; however, these cases were managed conservatively for two weeks, resulting in complete healing. Conclusion: The findings of this study suggest that the incorporation of Martius flap repair with fistulectomy and sphincteroplasty provides superior postoperative outcomes, including lower recurrence rates, reduced risk of incontinence, decreased wound infection, and improved patient satisfaction. Keywords: Anal fistula, Fistulectomy, Sphincteroplasty, Martius flap repair.
Page No: 887-890 | Full Text
Original Research Article
RETROGRADE ANALYSIS OF PREDICTABILITY OF CORONARY CALCIUM SCORE IN ESTABLISHED CORONARY ARTERY DISEASE PATIENTS UNDERGOING CORONARY ARTERY BYPASS GRAFTING
http://dx.doi.org/10.70034/ijmedph.2026.2.152
K S Harikrishna, Tella RamaKrishna Dev, Sai Surabhi P, RV Kumar
View Abstract
Background: Coronary artery disease (CAD) remains a leading cause of morbidity and mortality worldwide. Early identification of significant coronary artery involvement is crucial for timely intervention. Coronary artery calcium (CAC) scoring using computed tomography has emerged as a non-invasive tool for assessing atherosclerotic burden and predicting CAD severity. Objectives: The primary objective of this study was to determine the predictability of significant coronary artery disease using coronary calcium scoring (Agatston score). The secondary objectives were to correlate calcium scores with the degree of coronary vessel occlusion and with associated cardiovascular risk factors. Materials and Methods: This observational study included 72 patients diagnosed with triple vessel coronary artery disease who underwent coronary artery bypass grafting (CABG) over a period of six months at our center. All patients were evaluated using conventional coronary angiography and CT coronary angiography for calcium scoring. Demographic data, risk factors, and angiographic findings were recorded and analyzed. Results: Among the study population, 52 (72.2%) were males and 20 (27.8%) were females, with a mean age of 57 years. Diabetes mellitus was present in 61% of patients, hypertension in 68%, and a positive family history of CAD in 64%. A history of smoking and alcohol consumption was noted in 66% and 61% of patients, respectively. Significant left main coronary artery disease (>50% stenosis) was observed in 7% of patients. Approximately 10% of patients had complete occlusion of the left anterior descending artery, while 80–90% stenosis was seen in nearly 65% of patients. Only 7% of patients had low Agatston scores, whereas the majority demonstrated moderate to high scores, indicating a strong association between elevated coronary calcium scores and the severity of coronary artery disease. 
Conclusion: Coronary calcium scoring is a valuable non-invasive tool for predicting the presence and severity of significant coronary artery disease. Higher Agatston scores correlate well with the degree of vessel occlusion and the presence of conventional cardiovascular risk factors, making it a useful adjunct in risk stratification and clinical decision-making in patients with suspected or established CAD. Keywords: Coronary artery disease, Coronary calcium score, Agatston score, CT coronary angiography, Triple vessel disease, Coronary angiography, Coronary artery bypass grafting (CABG), Atherosclerosis.
Page No: 891-895 | Full Text
Original Research Article
THE ROLE OF ALBUMIN AND LIPID PROFILE AS BIOMARKERS FOR PREDICTING MORTALITY IN SEPSIS PATIENTS IN INTENSIVE CARE UNIT
http://dx.doi.org/10.70034/ijmedph.2026.2.153
G. Nivetha, K. Sumitra Vellaiammal, K. Sujatha, A. P. Thiyagarajan
Background: Worldwide, sepsis continues to be one of the main reasons for intensive care unit (ICU) admissions and deaths, especially in low- and middle-income countries. There is an urgent need for reliable, accessible biomarkers that improve patient care by enabling risk stratification. Serum albumin and lipid profile parameters have been identified as potential prognostic indicators in sepsis. The aim and objective was to assess the ability of serum albumin and lipid profile parameters to predict death in sepsis patients admitted to an ICU, compared with two well-known severity scores (APACHE II and SOFA). Materials and Methods: This study was performed over twelve months in a tertiary hospital on patients admitted to the ICU with sepsis. A total of 150 patients were included; data collection covered demographic information, comorbid conditions, and the clinical measurements performed on each patient. The APACHE II and SOFA scores were used to measure the severity of each patient's illness. Serum albumin and lipid profile (total cholesterol, HDL-C, LDL-C and triglycerides) were obtained at admission and then every 24 hours for three days following admission. All patients were categorized as survivors or non-survivors, and subsequent analyses compared these two groups. Results: Older adults (>70 years), people with diabetes mellitus, and those with chronic liver disease displayed considerably higher mortality rates than younger individuals without diabetes or chronic liver disease. Non-survivors had considerably higher APACHE II (26.02 ± 5.62 vs 19.20 ± 5.59) and SOFA scores (11.93 ± 3.69 vs 7.34 ± 2.47; p < 0.001) than survivors. Serum albumin concentrations were consistently lower in non-survivors than in survivors throughout the ICU stay.
In addition, total cholesterol, HDL-C, and LDL-C were significantly lower in non-survivors than in survivors at all study time points, while triglyceride levels showed no appreciable relation to mortality. Conclusion: Low serum albumin and decreased total cholesterol, high-density lipoprotein cholesterol (HDL-C), and low-density lipoprotein cholesterol (LDL-C) levels in individuals with sepsis substantially improve prognostic accuracy when used alongside established severity scores. These markers may also aid clinical decision making by giving the physician an accurate, cost-efficient assessment of the patient's prognosis. Keywords: Sepsis, ICU, Mortality, Albumin, Lipid Profile, HDL, LDL, Biomarkers, APACHE II, SOFA.
Page No: 896-901 | Full Text
Original Research Article
HEMATOLOGICAL PARAMETERS AS PREDICTORS OF DISEASE SEVERITY AND OUTCOMES IN COVID-19 PATIENTS: A PROSPECTIVE OBSERVATIONAL STUDY
http://dx.doi.org/10.70034/ijmedph.2026.2.154
G. Kayalvizhi, S. Sridevi, D. Dhivya
Background: Coronavirus disease 2019 (COVID-19) has a wide spectrum of clinical manifestations ranging from mild illness to severe disease and death. Early identification of patients at risk of severe disease is crucial for timely intervention. Hematological parameters and inflammatory markers have emerged as potential predictors of disease severity and outcomes. The aim was to evaluate hematological parameters as predictors of disease severity and clinical outcomes in COVID-19 patients; the objectives were to assess hematological parameters in patients with COVID-19 infection, to correlate these parameters with disease severity, and to evaluate their association with clinical outcomes such as ICU admission, ventilator requirement, and mortality. Materials and Methods: This prospective observational study was conducted on 150 RT-PCR confirmed COVID-19 patients admitted to a tertiary care hospital. Patients were categorized into severe and non-severe groups based on clinical criteria. Hematological parameters including hemoglobin, total leukocyte count, differential counts, platelet count, neutrophil-to-lymphocyte ratio (NLR), and platelet-to-lymphocyte ratio (PLR), along with inflammatory markers such as C-reactive protein (CRP) and serum ferritin, were recorded. Statistical analysis was performed using appropriate tests, and p < 0.05 was considered statistically significant. Results: Severe COVID-19 patients were significantly older and had a higher prevalence of comorbidities. Hematological findings revealed significant leukocytosis, neutrophilia, lymphopenia, and thrombocytopenia. NLR and PLR were significantly elevated in severe cases. Inflammatory markers such as CRP and ferritin were markedly higher in severe patients and non-survivors. These parameters showed strong associations with ICU admission, ventilator requirement, and mortality.
Conclusion: Hematological parameters, particularly lymphocyte count, NLR, PLR, CRP, and ferritin, are valuable predictors of disease severity and outcomes in COVID-19 patients. These readily available markers can aid in early risk stratification and improve clinical management. Keywords: COVID-19; Neutrophil-to-Lymphocyte Ratio; Hematological Parameters.
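The NLR and PLR discussed in this abstract are simple derived indices: the neutrophil count and platelet count each divided by the lymphocyte count. A minimal sketch, using hypothetical counts in cells/µL (none of the values below come from the study):

```python
# Hedged sketch: computing the neutrophil-to-lymphocyte ratio (NLR) and
# platelet-to-lymphocyte ratio (PLR) from a differential count.
# All counts are hypothetical and in cells/uL; they are not study data.

def nlr(neutrophils: float, lymphocytes: float) -> float:
    """Neutrophil-to-lymphocyte ratio."""
    return neutrophils / lymphocytes

def plr(platelets: float, lymphocytes: float) -> float:
    """Platelet-to-lymphocyte ratio."""
    return platelets / lymphocytes

# Hypothetical patient: neutrophils 8,400/uL, lymphocytes 1,200/uL,
# platelets 180,000/uL.
print(nlr(8400, 1200))     # 7.0
print(plr(180_000, 1200))  # 150.0
```

Higher values of both ratios reflect relative lymphopenia, which the study links to severe disease; any cut-offs for risk stratification would have to come from the full paper, not this sketch.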
Page No: 902-909 | Full Text
Original Research Article
IMPACT OF LOW-PRESSURE VERSUS STANDARD-PRESSURE PNEUMOPERITONEUM IN LAPAROSCOPIC CHOLECYSTECTOMY: AN OBSERVATIONAL CLINICAL STUDY
http://dx.doi.org/10.70034/ijmedph.2026.2.155
Rafiya Afreen, Pushpa Satish Kumar, Nithya T
Background: Laparoscopic cholecystectomy is the gold standard for the management of symptomatic cholelithiasis. The creation of pneumoperitoneum is essential for adequate visualization during the procedure, traditionally achieved using standard intra-abdominal pressures (12–14 mmHg). However, standard-pressure pneumoperitoneum has been associated with postoperative shoulder tip pain, cardiopulmonary alterations, and other physiological disturbances. Low-pressure pneumoperitoneum has been proposed as an alternative approach to minimize these adverse effects while maintaining surgical safety and efficacy. The primary objective of this study was to compare the incidence of postoperative shoulder tip pain between low-pressure and standard-pressure pneumoperitoneum in laparoscopic cholecystectomy. Secondary objectives included comparison of duration of surgery, intraoperative complications, and postoperative complications between the two groups. Materials and Methods: This observational clinical study was conducted in the Department of General Surgery at Dr. B.R. Ambedkar Medical College and Hospital, Bengaluru. Patients diagnosed with cholelithiasis and undergoing laparoscopic cholecystectomy were included and divided into two groups: Group A (low-pressure pneumoperitoneum) and Group B (standard-pressure pneumoperitoneum). Demographic data, operative details, and clinical outcomes were recorded. Outcome measures included postoperative shoulder tip pain assessed using the Visual Analogue Scale (VAS), duration of surgery, intraoperative complications such as bile spillage and bleeding, and postoperative complications. Patients were followed up for pain assessment up to 7 days postoperatively. Results: Both groups were comparable in baseline characteristics, with a predominantly female population in each group. The mean age was slightly higher in Group A (38.8 years) compared to Group B (37.9 years). 
The mean duration of surgery was longer in the low-pressure group (77.83 minutes) compared to the standard-pressure group (58 minutes). Postoperative shoulder tip pain was initially higher in the low-pressure group; however, pain scores between the groups became comparable by 24 hours and resolved completely in both groups by postoperative day 7. Intraoperative complications, including bile spillage and bleeding, were slightly more frequent in the low-pressure group, but the differences were not statistically significant. No significant difference was observed in postoperative complications between the two groups. Conclusion: Low-pressure pneumoperitoneum in laparoscopic cholecystectomy is a safe and feasible alternative to standard-pressure pneumoperitoneum. Although it is associated with a longer operative time and slightly higher initial postoperative pain, overall outcomes including complication rates are comparable between the two approaches. Low-pressure pneumoperitoneum can be considered in selected patients to minimize physiological disturbances without compromising surgical safety. Keywords: Laparoscopic cholecystectomy, Low-pressure pneumoperitoneum, Standard-pressure pneumoperitoneum, Shoulder tip pain, Operative time, Surgical outcomes.
Page No: 910-916 | Full Text
Original Research Article
A CROSS SECTIONAL STUDY ON THE ASSOCIATION BETWEEN SERUM IONIZED CALCIUM LEVEL AND SEVERITY OF DENGUE FEVER
http://dx.doi.org/10.70034/ijmedph.2026.2.156
Divyansh Baranwal, Puneet Tripathi, Meraj Rasool
Aim: To study the association between serum ionized calcium level and severity of dengue fever. Materials and Methods: A hospital-based cross-sectional study was conducted at T.S. Misra Medical College, Lucknow. One hundred serologically confirmed adult dengue patients (NS1/IgM positive) were enrolled. Patients were classified according to WHO 2009 criteria into dengue without warning signs, dengue with warning signs, and severe dengue. Serum ionized calcium was measured using an ion-selective electrode method. Hematological and biochemical parameters were recorded, and associations were analyzed using Chi-square and descriptive statistics. Results: The mean age was 35.1 years, with male predominance (68%). Severe dengue was observed in 22% of cases, dengue with warning signs in 42%, and dengue without warning signs in 36%. These categories were correlated with serum ionized calcium levels. Hypocalcemia was a prominent finding (p < 0.001). Patients with hypocalcemia had higher hematocrit (48.8 ± 3.9%) and lower platelet counts (49,000 ± 21,000/μL) than patients with normal ionized calcium levels. Conclusion: Hypocalcemia is significantly associated with severe dengue and correlates with hematocrit rise and thrombocytopenia. Routine monitoring of ionized calcium may serve as a simple, cost-effective prognostic marker for identifying patients at risk of complications. Larger multicentric studies are warranted to validate its role in clinical management. Keywords: Dengue fever, Ionized calcium, Hypocalcemia, Disease severity, Prognostic marker.
Page No: 917-921 | Full Text
Original Research Article
HEARING IMPAIRMENT AND ITS ASSOCIATED FACTORS AMONG PRE UNIVERSITY COLLEGE STUDENTS USING PERSONAL LISTENING DEVICES IN KOLAR – A CROSS-SECTIONAL STUDY
http://dx.doi.org/10.70034/ijmedph.2026.2.157
Angayarkanni.P, Vishwas.S., Ravi Shankar.S, Ujval.M
Background: Hearing loss is a growing public health concern among adolescents and young adults, in part due to unsafe listening practices associated with personal listening devices (PLDs). Recreational noise exposure from extended, high-volume PLD use is a well-known preventable risk factor for noise-induced hearing loss. Objectives: To estimate the prevalence of likely hearing impairment using the hearWHO app among PLD-using pre-university (PU) college students, and to identify the factors associated with likely hearing impairment. Materials and Methods: A cross-sectional analytical study was carried out among 360 pre-university college students aged 15-18 years who were regular users of PLDs. Students were selected from government and private colleges by stratified random sampling. Information was gathered using a predesigned, validated semi-structured questionnaire covering sociodemographic profile, pattern of PLD use, and awareness. Hearing screening was based on the hearWHO mobile app, with scores interpreted as follows: >75 (good hearing), 50–75 (follow-up needed), and <50 (suspected hearing impairment). Data were analyzed using SPSS version 22. Results: About 61% of participants were male and 53% were aged 17-18 years. PLDs were mostly used for entertainment (50.7%); 60% reported daily PLD use of ≥4 hours and 17.8% listened at high volumes. HearWHO screening found that 23.5% of the students likely had hearing impairment and 53.7% needed regular follow-up. Factors associated with likely hearing impairment were male gender (OR=2.67), a positive family history of hearing impairment (OR=5.94), daily use over 4 hours (OR=1.97), high-volume listening (OR=4.58), and the absence of listening breaks (OR=3.49). Conclusion: Despite their educational level, a large proportion of the PU students appeared to have incipient hearing impairment related to modifiable listening habits.
Mobile-based apps such as the hearWHO application can provide an inexpensive method for early screening. Preventing future hearing impairment requires the inclusion of safe listening education and regular hearing screening in school health programmes. Keywords: Audiometry, PLD, Noise, Hearing health.
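The hearWHO score bands quoted in this abstract (>75 good hearing, 50–75 follow-up needed, <50 suspected hearing impairment) amount to a simple classification rule. A minimal sketch assuming only those cut-offs; the function name and banding code are illustrative and are not part of the hearWHO app itself:

```python
# Hedged sketch of the hearWHO score bands described in the abstract:
# >75 good hearing, 50-75 follow-up needed, <50 suspected impairment.

def classify_hearwho(score: float) -> str:
    """Map a hearWHO score to the screening category used in the study."""
    if score > 75:
        return "good hearing"
    if score >= 50:
        return "follow-up needed"
    return "suspected hearing impairment"

print(classify_hearwho(80))  # good hearing
print(classify_hearwho(60))  # follow-up needed
print(classify_hearwho(42))  # suspected hearing impairment
```

A score of exactly 75 falls in the 50–75 "follow-up needed" band under this reading of the abstract's thresholds.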
Page No: 922-927 | Full Text
Original Research Article
FUNCTIONAL AND RADIOLOGICAL OUTCOME OF UNSTABLE DISTAL RADIUS FRACTURES TREATED WITH VOLAR PLATING-A RETROSPECTIVE STUDY
http://dx.doi.org/10.70034/ijmedph.2026.2.158
G Nagaraju, Ebel Raj N, Amal PS, Narayana Rao, Saidath Tentu
Background: The purpose of the study was to assess the functional and radiological outcome following surgical management of unstable distal end radius fractures with volar T-plating. Materials and Methods: In this study, 25 patients with unstable fractures of the distal end of the radius satisfying the inclusion criteria were treated surgically by ORIF with a volar T-plate. Patients were followed up at 1, 4 and 8 months to assess outcome radiologically and clinically on the basis of range of movement of the wrist, grip strength, and modified Green and O'Brien scores. A detailed analysis of complications was also performed. Results: Results were graded as excellent, good, fair and poor on the basis of the modified Green and O'Brien score. Of the 25 patients, 16 (64%) achieved an excellent score, 6 (24%) had a good outcome, 2 (8%) had a fair score and one (4%) had a poor score. All patients attained good bony union. Two had screw penetration into the joint and one had a surgical site infection; these complications accounted for the fair and poor results. No wrist arthritis was reported in this study. Conclusion: Patients with unstable, dorsally displaced fractures of the distal end of the radius treated surgically with volar T-plating had excellent to good functional results. However, the method can have complications such as intra-articular screw penetration, extensor tendon irritation, and infection. Keywords: grip strength, T-plate, modified Green and O'Brien score, unstable distal radius fractures, volar plating.
Page No: 928-932 | Full Text
Original Research Article
IMPLEMENTATION AND EVALUATION OF MENTOR- MENTEE PROGRAMME AT NEWLY ESTABLISHED MEDICAL COLLEGE IN WESTERN INDIA
http://dx.doi.org/10.70034/ijmedph.2026.2.159
Pramod Jaysingrao Shinde, Archana Shinde
Background: Exposure to a new environment and a professional course has a varied impact on medical students. The teacher-student relationship is important for better academic performance as well as the social and psychological wellbeing of medical students. A mentor-mentee program helps students in career development by increasing their interest in the subject, inspiring them to do research, and guiding their professional development. It also helps them emotionally by providing mental support to relieve anxiety and stress. Aim: To implement and assess the effectiveness of a mentor-mentee program. Objectives: To assess the academic performance of students as a measure of the program's effectiveness, and to assess the perception of students and mentors about the mentor-mentee program. Materials and Methods: This interventional, prospective study was conducted at a tertiary care hospital and newly established medical college in western India from August 2024 to December 2024. All second phase MBBS students were included after taking informed consent. A total of 322 students were included: 181 students (Batch 2022-23) and 141 newly joined students (Batch 2023-24). Students were divided into groups of 12 per mentor. Mentor-mentee meetings were held once a month. Quantitative data were collected from attendance records and marks, and qualitative data were collected from mentees and mentors through a questionnaire. Results: Statistically significant improvement (p < 0.001) was noted in attendance and marks. The majority of mentees felt supported in academic and personal growth (90.18%) and received encouragement and motivation (90.18%). While 70.09% acknowledged emotional support, 87.50% agreed their mentor suggested useful resources. A majority of mentors (77.78%) agreed that the mentorship program helped build stronger relationships with their mentees, and approximately 77% showed interest in mentoring in the future.
Conclusion: Appropriate implementation of a mentorship program boosted academic performance among students and built stronger relationships with mentors. It was also helpful in providing emotional support. Keywords: Mentor, Mentee, Mentorship.
Page No: 933-938 | Full Text
Original Research Article
ASSOCIATION BETWEEN DURATION OF SMOKELESS TOBACCO USE, VAPING, AND INTER-INCISAL MOUTH OPENING IN PATIENTS WITH ORAL SUBMUCOUS FIBROSIS: A CROSS-SECTIONAL STUDY
http://dx.doi.org/10.70034/ijmedph.2026.2.160
Ana Luiza Monroy Francisconi Volles, Asadullah Shakeel, Mateen Ahmed Khan, Irfan Qureshi, Faisal Asghar, Zubia Waqar
Background: Oral submucous fibrosis (OSMF) is a chronic, potentially malignant disorder strongly associated with smokeless tobacco use and characterized by progressive reduction in mouth opening. The objective was to assess the association between duration of smokeless tobacco use, vaping, and inter-incisal mouth opening in patients with oral submucous fibrosis. Materials and Methods: This cross-sectional study was conducted at a tertiary care hospital in Karachi from March 2024 to July 2025, including 355 patients diagnosed with oral submucous fibrosis. Results: The mean age was 31.7 ± 9.4 years, and the mean duration of smokeless tobacco use was 8.9 ± 4.6 years. Mean inter-incisal mouth opening was 27.8 ± 7.1 mm. Mouth opening declined significantly with increasing duration of tobacco use, from 33.4 ± 4.9 mm in patients with ≤5 years of exposure to 21.4 ± 4.8 mm in those with >10 years (p<0.001). Vaping users had lower mouth opening than non-users (24.2 ± 5.9 vs 29.1 ± 6.8 mm; p=0.011). Severely restricted mouth opening was significantly associated with duration >10 years (58.3% vs 21.4%; p<0.001), Grade III–IV OSMF (78.6% vs 24.0%; p<0.001), and vaping exposure (36.9% vs 23.2%; p=0.028). Conclusion: Longer duration of smokeless tobacco use, higher exposure frequency, and vaping were significantly associated with reduced mouth opening and greater severity of OSMF. Early risk factor modification may help slow the progression of functional impairment. Keywords: Oral submucous fibrosis, smokeless tobacco, vaping, inter-incisal mouth opening, trismus.
Page No: 939-943 | Full Text
Original Research Article
UTILITY OF ECHOCARDIOGRAPHY IN EARLY DIAGNOSIS AND PREVENTION OF CARDIAC REMODELING IN RHEUMATIC HEART DISEASE: A PROSPECTIVE STUDY
http://dx.doi.org/10.70034/ijmedph.2026.2.161
Prakhar Pandey, Aanchal Sirohi, Aakash Yadav
Background: Rheumatic heart disease (RHD) remains a major public health problem in developing countries, leading to significant morbidity and mortality among children and young adults. Early detection of valvular involvement and prevention of cardiac remodeling are essential to improve clinical outcomes. The aim is to assess the utility of echocardiography in the early diagnosis and prevention of cardiac remodeling in patients with rheumatic heart disease. Materials and Methods: This prospective observational study was conducted in a tertiary care hospital in Delhi from December 2022 to December 2024. A total of 1494 patients were included. All participants underwent detailed clinical evaluation and echocardiographic assessment using standard (STAND) and handheld (HAND) echocardiography. Results: Out of 1494 participants, 1238 (82.9%) were normal, 180 (12.0%) had borderline RHD, and 76 (5.1%) had definite RHD. Mitral regurgitation was the most common lesion, with severe MR predominantly observed in definite RHD cases (86.8%). Conclusion: Echocardiography is a highly sensitive and reliable modality for early detection of RHD and assessment of cardiac remodeling. Keywords: Rheumatic heart disease, Echocardiography, Cardiac remodeling, Mitral regurgitation, Handheld echocardiography, Screening.
Page No: 944-949 | Full Text
Original Research Article
FIXATION OF NON-THUMB METACARPAL FRACTURES USING THREE-POINT FIXATION PRINCIPLE WITH MULTIPLE PRE-BENT K-WIRES: SURGICAL TECHNIQUE AND CLINICAL RESULTS STUDY
http://dx.doi.org/10.70034/ijmedph.2026.2.162
Amit B. Garud, Prashant Pandey, Nishant Gaonkar
Background: Metacarpal fractures are among the most common hand injuries, accounting for 18–44% of all hand fractures, with non-thumb metacarpals being the most frequently involved. While many fractures can be managed conservatively, unstable, displaced, or rotationally deformed fractures often require surgical fixation. Various fixation techniques exist, with Kirschner wire (K-wire) fixation being widely used due to its simplicity and minimal soft tissue disruption. This study evaluates a modified technique using two pre-bent K-wires based on the principle of three-point fixation; the objective was to assess the functional and clinical outcomes of non-thumb metacarpal fractures treated with two pre-bent intramedullary K-wires. Materials and Methods: A prospective study was conducted on 18 hands of 17 adult patients with 35 non-thumb metacarpal fractures treated between July 2018 and December 2019 at a tertiary care center. Inclusion criteria included extra-articular fractures with operative indications such as angulation and rotational deformity. Patients were treated using a technique involving two or occasionally three pre-bent K-wires inserted through separate entry points. Postoperative immobilization was followed by early mobilization. Functional outcomes were assessed using the Quick Disabilities of the Arm, Shoulder and Hand (Q-DASH) score and Total Active Motion (TAM) at 6 weeks, 3 months, and 6 months. Results: The mean age of patients was 41.6 years, with a male predominance (70%). The fifth metacarpal was most commonly affected. Significant improvement in functional outcomes was observed over time. Mean Q-DASH scores improved from 44.56 at 6 weeks to 20.08 at 3 months and 8.07 at 6 months (p < 0.05). Most patients achieved good to excellent range of motion by 3 months and near-normal grip strength by 6 months. Complications were minimal, with only two cases of superficial pin tract infection managed conservatively.
Conclusion: The use of two pre-bent intramedullary K-wires provides a simple, minimally invasive, and effective method for stabilizing non-thumb metacarpal fractures. The technique offers good functional outcomes, low complication rates, and allows early mobilization, making it a viable alternative to conventional fixation methods. Keywords: Metacarpal fractures, non-thumb metacarpal fractures, three-point fixation, Kirschner wires, intramedullary fixation.
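The Q-DASH scores reported in this abstract come from the published QuickDASH instrument: 11 five-point items, scored as ((mean item response) − 1) × 25 on a 0–100 scale, valid when at least 10 items are answered. The sketch below follows that published formula; the function name and example responses are illustrative:

```python
# Sketch of the standard QuickDASH (Q-DASH) disability score:
# score = ((mean of answered items) - 1) * 25, on a 0-100 scale,
# computable only when at least 10 of the 11 items are answered.

def quickdash(responses):
    """responses: list of 11 item answers, each 1-5, or None if unanswered."""
    answered = [r for r in responses if r is not None]
    if len(answered) < 10:
        raise ValueError("Q-DASH requires at least 10 of 11 items answered")
    return (sum(answered) / len(answered) - 1) * 25

# All 11 items answered "3" -> mean 3 -> (3 - 1) * 25 = 50.0
print(quickdash([3] * 11))  # 50.0
```

Higher scores indicate greater disability, which is why the reported fall from 44.56 to 8.07 over six months represents improvement.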
Page No: 950-954 | Full Text
Original Research Article
EFFECT OF SELF-CARE MANAGEMENT ON QUALITY OF LIFE AMONG CHRONIC OBSTRUCTIVE PULMONARY DISEASE PATIENTS: A PROSPECTIVE STUDY FROM CENTRAL INDIA
http://dx.doi.org/10.70034/ijmedph.2026.2.163
Manyaa Singh, Neeraj Mer, Kshipra Srivastava
Background: Chronic obstructive pulmonary disease (COPD) is a leading cause of morbidity and mortality globally, placing a substantial burden on patients and healthcare systems. In India, the prevalence of COPD ranges from 3.49% to 16.05%, with projections indicating a rise from 9 lakh to 16 lakh cases between 2005 and 2030. COPD profoundly impairs quality of life (QoL) across physical, psychological, and social domains. Self-care management—encompassing medication adherence, pulmonary rehabilitation, nutritional optimization, and lifestyle modification—has emerged as a cornerstone intervention to mitigate disease burden and improve QoL. Despite robust international evidence, data from resource-limited Indian settings remain limited, highlighting the need for context-specific investigation. Objectives: This study aimed to assess baseline self-care practices and QoL among COPD inpatients, and to evaluate the impact of structured self-care counselling on QoL after three months of home-based follow-up. Materials and Methods: A prospective observational study was conducted at the School of Excellence in Pulmonary Medicine (SEPM), Netaji Subhash Chandra Bose Medical College and Hospital, Jabalpur, Madhya Pradesh, from December 2023 to May 2024. Fifty COPD inpatients diagnosed per GOLD criteria were enrolled using consecutive sampling. Data were collected using a structured sociodemographic and self-care questionnaire and the validated St. George's Respiratory Questionnaire (SGRQ). After baseline assessment, patients received individualized self-care counselling and were followed up at three months. Pre- and post-intervention self-care scores and SGRQ scores were compared using paired t-test and Wilcoxon signed-rank test. Pearson's correlation coefficient assessed the relationship between self-care scores and SGRQ scores. Results: All 50 patients (90% male; 100% aged >40 years) were enrolled. 
Significant improvements were observed in self-care practices: yoga (14% to 86%), regular walking (34% to 88%), adequate sleep (56% to 88%), water intake (56% to 88%), avoidance of processed food (28% to 82%), and indoor irritant management (50% to 92%) (p<0.001 for each). The mean self-care score increased from 15.18±1.56 to 19.62±1.19 (p<0.001). Total SGRQ score declined from 63.1 (IQR: 47.20–72.40) to 41.3 (IQR: 31.23–48.83) (p<0.001), reflecting significant QoL improvement. Counselling was very efficacious in 68%, moderately efficacious in 8%, and had no effect in 18% of participants. A statistically significant moderate negative correlation was identified between self-care score and total SGRQ score (r = −0.542, p<0.001). Conclusion: Structured self-care counselling and home-based follow-up significantly improved self-care practices and quality of life among COPD patients in a resource-limited Indian setting. Integration of self-care education into routine clinical practice represents a cost-effective strategy to reduce disease burden. Keywords: COPD, self-care management, quality of life, SGRQ, pulmonary rehabilitation, India.
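The correlation this abstract reports (r = −0.542 between self-care score and total SGRQ score) is a standard Pearson coefficient. The sketch below shows the computation on invented paired values; the numbers are for illustration only and are not the study's data:

```python
# Hedged sketch: Pearson's r between a self-care score and the total SGRQ
# score. The paired values below are invented; they are not study data.
from math import sqrt

def pearson_r(xs, ys):
    """Pearson correlation coefficient for two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    syy = sum((y - my) ** 2 for y in ys)
    return sxy / sqrt(sxx * syy)

self_care = [14, 15, 16, 18, 19, 20]  # hypothetical self-care scores
sgrq = [70, 66, 60, 48, 44, 40]       # hypothetical total SGRQ scores

r = pearson_r(self_care, sgrq)
print(round(r, 3))  # strongly negative: better self-care, lower (better) SGRQ
```

A negative r is the expected direction here because lower SGRQ totals indicate better quality of life.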
Page No: 955-960 | Full Text
Original Research Article
DECODING MYOMETRIAL PATHOLOGY: A CLINICO-HISTOMORPHOLOGICAL AND IMMUNOHISTOCHEMICAL INSIGHT INTO SMOOTH MUSCLE NEOPLASMS, ESPECIALLY LEIOMYOMAS
http://dx.doi.org/10.70034/ijmedph.2026.2.164
Abhishek Kumar Gupta, Pradeep Kumar Sharma, Alok Mohan, Kamna Gupta, Bharti Maheshwari
Background: Uterine myometrial lesions encompass a broad spectrum of non-neoplastic and neoplastic conditions and are a significant cause of gynecological morbidity in women. These lesions are among the most common indications for hysterectomy. Smooth muscle tumors, particularly leiomyomas, represent the most frequent benign neoplasms of the uterus. Histopathological examination plays a crucial role in identifying the spectrum of myometrial lesions and in differentiating benign from malignant smooth muscle tumors. The aim and objective was to analyze the clinico-histomorphological spectrum of uterine myometrial lesions in hysterectomy specimens, with special reference to smooth muscle neoplasms; the immunohistochemical profile of leiomyomas was also studied. Materials and Methods: This descriptive study was conducted on 300 hysterectomy specimens received in the Department of Pathology. Relevant clinical details including age, presenting complaints, and indications for hysterectomy were recorded. All specimens were subjected to detailed gross examination followed by routine histopathological processing and microscopic evaluation using hematoxylin and eosin-stained sections. The lesions were categorized and analyzed according to their histomorphological features. Immunohistochemistry for estrogen receptor (ER), progesterone receptor (PR), and CD34 was performed on the leiomyomas. Results: Among the 300 cases studied, leiomyoma was the most common lesion observed, accounting for 46.3% of cases. Adenomyosis was the second most common lesion, seen in 29.3% of cases, while coexistent leiomyoma with adenomyosis was noted in 22% of cases. Other rare myometrial lesions constituted 2.4% of the cases. Leiomyomas were predominantly observed in women of the reproductive age group and commonly presented with symptoms such as menorrhagia, dysmenorrhea, and abdominal mass.
Various histomorphological patterns of leiomyoma were identified, whereas malignant and other rare mesenchymal tumors were encountered infrequently. Immunohistochemical expression of estrogen receptor (ER) and progesterone receptor (PR), along with mean blood vessel density assessed by CD34, was also evaluated in cases of leiomyoma. Conclusion: Leiomyoma remains the most common uterine myometrial lesion and a major indication for hysterectomy. Comprehensive histopathological evaluation is essential for identifying variant patterns and differentiating benign from malignant smooth muscle tumors, thereby ensuring accurate diagnosis and appropriate patient management. Keywords: Adenomyosis; Histopathology; Hysterectomy; Immunohistochemistry; Leiomyoma; Smooth muscle tumors.
Page No: 961-967 | Full Text
Original Research Article
SERUM ZINC AND SELENIUM STATUS AND LIPID PROFILE ABNORMALITIES IN CRITICALLY ILL PATIENTS WITH ACUTE MYOCARDIAL INFARCTION: A CROSS-SECTIONAL OBSERVATIONAL STUDY
http://dx.doi.org/10.70034/ijmedph.2026.2.165
Sheetalkumari Paresh Patel, Kalpeshkumar Chhaganbhai Nakarani, Kapilkumar A. Shahane, Vilas U. Chavan
Background: Myocardial infarction (MI) remains a leading cause of morbidity and mortality worldwide, with a rising burden in low- and middle-income countries such as India. Beyond traditional cardiovascular risk factors, oxidative stress and systemic inflammation play key roles in the pathogenesis and severity of MI. Essential trace elements such as zinc and selenium are critical components of antioxidant defense mechanisms, and their deficiency may aggravate myocardial injury. However, data evaluating trace element status and its association with lipid abnormalities in critically ill MI patients from the Indian population are limited. This study assessed serum zinc and selenium levels in critically ill patients with acute myocardial infarction, compared lipid profile parameters between MI patients and healthy controls, and evaluated the relationship between trace element levels and lipid profile abnormalities. Materials and Methods: This hospital-based cross-sectional observational study included 50 critically ill patients diagnosed with acute myocardial infarction admitted to the intensive care unit and 50 age- and sex-matched healthy controls. Serum zinc and selenium levels were measured using atomic absorption spectrophotometry, and lipid profile parameters were estimated using standard enzymatic methods. Data were expressed as mean ± standard deviation, and appropriate statistical tests were applied. Correlation analysis was performed to assess relationships between trace elements and lipid parameters. Results: The mean age of MI patients was 56.08 ± 10.18 years, with a male predominance (60%). MI patients demonstrated significant dyslipidemia, with higher total cholesterol, LDL cholesterol, and triglyceride levels compared to controls (p < 0.05). Serum selenium (77.46 ± 13.46 μg/L) and zinc (53.79 ± 25.08 μg/dL) levels were significantly lower in MI patients than in controls (p < 0.01).
Selenium deficiency was observed in 76% and zinc deficiency in 86% of MI patients. No significant correlations were found between serum zinc or selenium levels and lipid profile parameters, while strong positive correlations were observed among lipid parameters themselves. Conclusion: Critically ill patients with acute myocardial infarction exhibit significant dyslipidemia along with profound deficiencies of zinc and selenium. These abnormalities appear to coexist rather than demonstrate a direct interaction, suggesting that micronutrient deficiency may contribute to myocardial injury primarily through oxidative and inflammatory pathways. Routine assessment of trace element status in critically ill MI patients may provide additional insights into disease severity and guide future interventional strategies. Keywords: Myocardial infarction; Zinc; Selenium; Dyslipidemia; Trace elements; Oxidative stress; Critically ill patients; Lipid profile.
Page No: 968-974 | Full Text
Original Research Article
INTEGRATED EVALUATION OF SYNDROMIC, LABORATORY, AND MATERNAL–INFANT SYPHILIS INDICATORS AT A DESIGNATED STI/RTI CLINIC
http://dx.doi.org/10.70034/ijmedph.2026.2.166
Sumedh R. Khandare, Sandeep B. Hade, Prashant Deepak Ukey, Rachana Laul, Abhijit Deshmukh
View Abstract
Background: Sexually transmitted infections (STIs) and reproductive tract infections (RTIs) continue to be major public health concerns. Monitoring performance indicators from Designated STI/RTI Clinics (DSRCs) is essential for assessing service delivery effectiveness and progress toward syphilis control and elimination of mother-to-child transmission (EMTCT). This study provides a comprehensive programmatic evaluation of a DSRC, focusing on service delivery, syndromic burden, HIV and syphilis screening, partner management, and maternal–infant syphilis prevention. Materials and Methods: A retrospective, facility-based descriptive programmatic evaluation was conducted using aggregated monthly DSRC data derived from STI service delivery and syndromic reporting formats, laboratory HIV and syphilis records, and antenatal care (ANC) follow-up data. Indicators assessed included service coverage, STI syndromic caseloads, HIV and syphilis test results, partner notification outcomes, and maternal–infant syphilis cascade indicators. Results: Among 5,518 STI attendees, counselling and screening coverage was 100%, with 89.8% receiving condoms and 68.5% undergoing risk assessment. A total of 3,267 syndromic STI/RTI cases were identified, predominantly lower abdominal pain (1,038) and vaginal/cervical discharge (769). HIV screening was universal, with a 0.36% positivity rate (20 cases). Syphilis screening achieved 100% coverage, with 26 reactive cases, all treated. HIV–syphilis co-infection was noted in 5 cases. Partner management was highly effective (99.5%). Among 5,670 pregnant women, universal syphilis screening identified 3 cases, all treated; however, one congenital syphilis case was reported. Conclusion: The DSRC demonstrated strong programmatic performance in counselling, universal screening, and partner management. However, persistent syndromic burden and incomplete documentation of risk assessment highlight areas for improvement. 
Strengthening early ANC engagement, improving documentation quality, and expanding outreach to high-risk and marginalized populations are required to further advance syphilis control and EMTCT goals. Keywords: STI, RTI, DSRC, syphilis, HIV screening, congenital syphilis, partner notification, program evaluation.
Page No: 975-979 | Full Text
Original Research Article
CLINICAL PROFILE AND EARLY POSTNATAL OUTCOMES OF INFANTS WITH PRENATALLY DETECTED HYDRONEPHROSIS: A RETROSPECTIVE COHORT STUDY
http://dx.doi.org/10.70034/ijmedph.2026.2.167
Suryaprakash T, Mahesh Khanna S A, Deepak Chakravarthy SS, Arunkumar Raju
View Abstract
Background: Prenatal hydronephrosis (PNH) is one of the most commonly detected anomalies on antenatal ultrasonography and may represent a spectrum of congenital anomalies of the kidney and urinary tract (CAKUT). While many cases resolve spontaneously, a subset is associated with urinary tract infections (UTIs), structural abnormalities, and need for surgical intervention. Objectives: To evaluate the clinical profile, imaging characteristics, and early postnatal outcomes of infants with prenatally detected hydronephrosis, and to assess the association between urinary tract dilation (UTD) grading and clinical outcomes. Materials and Methods: This retrospective observational cohort study was conducted at the Department of Pediatrics, Swamy Vivekanandha Medical College Hospital & Research Institute (SVMCHRI), Tamil Nadu, India, from April 2022 to October 2025. Infants with antenatally detected hydronephrosis (anteroposterior pelvic diameter ≥7 mm at 28–36 weeks gestation) and available postnatal imaging were included. Data on demographic characteristics, antenatal findings, postnatal imaging, and outcomes were collected. Statistical analysis was performed using SPSS v26, with p<0.05 considered significant. Results: A total of 200 infants were included, with a male predominance (69%). Hydronephrosis was unilateral in 66% of cases, and the majority were classified as low-risk UTD P1 (56%). Structural urinary tract anomalies were identified in 32% of infants, with ureteropelvic junction obstruction being the most common (15%). Urinary tract infection occurred in 21% of infants, while 15% required surgical intervention. Overall, 66% of cases showed spontaneous resolution. Higher UTD grades (P2–P3) were significantly associated with increased risk of UTI (44%), need for surgery (50%), and lower resolution rates (25%) (p<0.001). 
Conclusion: Prenatally detected hydronephrosis generally has a favorable prognosis; however, higher UTD grades and associated structural anomalies are important predictors of adverse outcomes. Risk stratification using UTD classification is essential for guiding follow-up and management. Keywords: Prenatal hydronephrosis, urinary tract dilation, UTI, UPJO, VUR, infants, cohort study.
Page No: 980-987 | Full Text
Original Research Article
A STUDY ON USE OF SMART-COP SCORE IN PREDICTING THE SEVERITY OUTCOMES AMONG PATIENTS WITH COMMUNITY-ACQUIRED PNEUMONIA
http://dx.doi.org/10.70034/ijmedph.2026.2.168
Sireesha Ella, Kulakarni Jayanth, K. Siva Prasad
View Abstract
Background: Community-acquired pneumonia (CAP) remains a leading cause of morbidity and mortality worldwide. Early identification of patients at risk for severe outcomes is essential for timely intervention. Conventional scoring systems such as CURB-65 and the Pneumonia Severity Index (PSI) have limitations in predicting the need for intensive respiratory or vasopressor support (IRVS). The SMART-COP score was developed to better identify such high-risk patients. The objective was to evaluate the effectiveness of the SMART-COP score in predicting severity outcomes among patients with CAP and to compare its performance with CURB-65 and PSI. Materials and Methods: This hospital-based cross-sectional study was conducted at a tertiary care center over 18 months (June 2023–December 2024). A total of 100 adult patients diagnosed with CAP were included. Clinical evaluation, laboratory investigations, and chest radiography were performed. Severity was assessed using SMART-COP, CURB-65, and PSI scores. Outcomes measured included ICU admission, need for IRVS, ventilatory support, and 30-day mortality. Statistical analysis was performed using SPSS version 26, with ROC curve analysis used to evaluate predictive performance. Results: The mean age was 57.44 ± 11.99 years, with 62% males. Based on SMART-COP, 57% were low risk, while 30% were categorized as high or very high risk. ICU admission was required in 31% of patients, and 22% required IRVS. The mortality rate was 9%. SMART-COP demonstrated superior predictive performance for ICU admission (AUC = 0.973), IRVS (AUC = 0.991), and intubation (AUC = 0.998) compared to PSI and CURB-65. It also showed excellent sensitivity for mortality prediction. Conclusion: SMART-COP is a reliable and superior tool for predicting severe outcomes in CAP patients. Its use in routine clinical practice can improve early risk stratification, guide timely intervention, and optimize patient outcomes.
Keywords: Community-acquired pneumonia, SMART-COP, CURB-65, Pneumonia Severity Index, ICU admission, IRVS, Severity scoring.
Page No: 988-993 | Full Text
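For readers unfamiliar with the AUC figures quoted in the abstract above (e.g. 0.973 for ICU admission): the empirical ROC AUC equals the probability that a randomly chosen patient with the outcome scores higher than a randomly chosen patient without it. The sketch below illustrates that pairwise definition; the `auc` helper and the sample scores are invented for illustration and are not drawn from the study.

```python
def auc(scores_pos, scores_neg):
    """Empirical AUC: fraction of (positive, negative) pairs ranked correctly,
    counting ties as half a win (the Mann-Whitney U interpretation)."""
    wins = 0.0
    for p in scores_pos:
        for n in scores_neg:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(scores_pos) * len(scores_neg))

# Toy severity scores (hypothetical, not study data):
# patients who required ICU admission vs. those who did not.
icu = [5, 6, 4, 7, 5]
no_icu = [1, 2, 0, 3, 2, 1]

print(round(auc(icu, no_icu), 3))  # perfectly separated toy groups -> 1.0
```

An AUC of 1.0 means every severe case outscores every non-severe case; 0.5 is chance-level discrimination, which is why values near the reported 0.97–0.99 indicate near-perfect separation.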
Original Research Article
ASSOCIATION BETWEEN PROLONGED QTC INTERVAL AND MICROALBUMINURIA IN PATIENTS WITH TYPE 2 DIABETES MELLITUS: A CROSS-SECTIONAL STUDY
http://dx.doi.org/10.70034/ijmedph.2026.2.169
Sireesha Ella, Sobhanadri Naga Subhendra Nath, K. Siva Prasad
View Abstract
Background: Type 2 diabetes mellitus (T2DM) is associated with multiple microvascular complications, including cardiac autonomic neuropathy (CAN) and diabetic nephropathy. Prolonged QTc interval is a non-invasive marker of CAN, while microalbuminuria reflects early renal involvement and endothelial dysfunction. The aim was to assess the association between prolonged QTc interval and microalbuminuria in patients with T2DM. Materials and Methods: This analytical cross-sectional study was conducted at Alluri Sitarama Raju Academy of Medical Sciences (ASRAM), Eluru, India, over 18 months. A total of 100 T2DM patients aged 50–79 years with microalbuminuria were included. QTc interval was measured using a 12-lead electrocardiogram and corrected using the modified Bazett’s formula. Microalbuminuria was assessed using the urine albumin-to-creatinine ratio (ACR) and categorized into Grade I and Grade II. Statistical analysis was performed using IBM SPSS version 26, with the Chi-square test applied for associations. Results: The mean age of participants was 63.47 ± 8.50 years, with a male predominance (57%). QTc prolongation was observed in 68% of patients. Grade II microalbuminuria was present in 56% of cases. A highly significant association was found between QTc prolongation and microalbuminuria severity (χ² = 26.50, p < 0.0001). QTc prolongation also showed a significant association with duration of diabetes (χ² = 8.730, p = 0.015), with increasing prevalence in patients with longer disease duration. No significant gender association was observed. Conclusion: In T2DM patients, a prolonged QTc interval is strongly linked to microalbuminuria and the duration of diabetes, suggesting a strong connection between cardiac autonomic dysfunction and early diabetic nephropathy. Early identification and risk stratification can be facilitated by routine ECG and microalbuminuria monitoring.
Keywords: Type 2 diabetes mellitus, QTc prolongation, microalbuminuria, cardiac autonomic neuropathy, diabetic nephropathy
Page No: 994-999 | Full Text
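The methods in the abstract above cite a “modified Bazett’s formula” without stating it. For reference, the standard Bazett correction is (the exact modification used in the study is not given in the abstract):

```latex
QT_c = \frac{QT}{\sqrt{RR}}
```

with the QT interval and the preceding RR interval both measured in seconds; QTc values above roughly 440–450 ms are conventionally read as prolonged, though the study’s specific cutoff is not stated in the abstract.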
Original Research Article
SPECTRUM OF OCCUPATIONAL DISEASES AND RESPIRATORY HEALTH AMONG BRICK KILN WORKERS IN KARNATAKA: A CROSS-SECTIONAL ANALYSIS
http://dx.doi.org/10.70034/ijmedph.2026.2.170
Biswabinod Sanfui, Avinash Dubey, Prachi Jasuja, Manami Das Sanfui
View Abstract
Background: India's brick manufacturing industry employs over 10 million workers facing significant occupational health hazards, yet comprehensive health data remain limited, particularly in Karnataka. The objective was to assess the prevalence of occupational health disorders, emphasizing respiratory dysfunction among brick factory workers in Malur Taluk, Karnataka, and to examine dose-response relationships between exposure duration and pulmonary outcomes. Materials and Methods: A cross-sectional study of 239 brick factory workers (≥18 years) from 17 randomly selected factories in Malur Taluk, Kolar district, Karnataka was conducted. Structured questionnaires captured sociodemographic, occupational, and health data. Spirometry (using a Contec SP10 spirometer) was performed as per ATS/ERS guidelines, measuring FEV₁, FVC, and PEFR as percent-predicted values. Statistical analysis included ANOVA, Pearson correlation, chi-square test, and linear trend analysis. Results: The workforce comprised 160 males (67%) and 79 females (33%); mean age 34.2 ± 11.8 years. Spirometric abnormalities were detected in 76.6% of workers, with female workers (86.1%), undernourished individuals (94.4%), and those with >10 years exposure (94.8%) most affected. FEV₁ declined from 74.5 ± 18.2% predicted (≤1 year) to 57.3 ± 26.1% (>10 years; p<0.001); PEFR showed the steepest fall (76.9% → 56.5%; Δ = −20.4%). Strong negative Pearson correlations confirmed cumulative lung damage: PEFR (r = −0.389, p<0.001), FEV₁ (r = −0.342, p<0.001). Musculoskeletal complaints affected 33.1% of workers. Conclusion: Brick factory workers in Karnataka face a severe occupational health crisis, with progressive respiratory impairment directly proportional to exposure duration. Comprehensive interventions including dust control, mandatory spirometry surveillance, PPE enforcement, and manufacturing modernization are urgently required.
Keywords: Brick kiln workers; occupational hazards; respiratory dysfunction; spirometry; musculoskeletal disorders; FEV₁; FVC; PEFR.
Page No: 1000-1005 | Full Text
Original Research Article
PREVALENCE OF METABOLIC DYSFUNCTION-ASSOCIATED STEATOTIC LIVER DISEASE IN CHILDREN: AN OBSERVATIONAL STUDY
http://dx.doi.org/10.70034/ijmedph.2026.2.171
Yogesh Choudhary, Ayesha Ahmad, Gulnaz Nadri
View Abstract
Background: Metabolic dysfunction-associated steatotic liver disease (MASLD), previously known as non-alcoholic fatty liver disease (NAFLD), is increasingly recognized as the most common chronic liver disease in children, closely associated with obesity and metabolic abnormalities. Early identification is essential to prevent long-term complications. Aim: The present study aimed to evaluate the prevalence of MASLD in children. The primary objective was to estimate the prevalence of MASLD among children aged 5 to 15 years. The secondary objective was to assess the correlation between MASLD and body mass index (BMI), as well as its association with metabolic syndrome, in order to better understand the risk factors contributing to the disease in the pediatric population. Materials and Methods: This cross-sectional study included 100 overweight and obese children aged 5–15 years. Anthropometric measurements, biochemical parameters (ALT), and FibroScan were used for evaluation. MASLD was diagnosed based on liver stiffness ≥5.1 kPa. Statistical analysis was performed using the chi-square test and correlation analysis. Results: The overall prevalence of MASLD was 36%. A higher prevalence was observed among obese children (40.91%) compared to overweight children (26.47%). MASLD was more common in males (42.59%) than females (28.26%), though not statistically significant (p > 0.05). A significant association was found between elevated ALT levels and MASLD (p < 0.001), with a moderate positive correlation (r = 0.433). FibroScan findings indicated that most children had no or mild fibrosis. Although none of the participants fulfilled the criteria for metabolic syndrome, components such as dyslipidemia (42.9%) and hypertension (23.8%) were present. Conclusion: MASLD is highly prevalent among overweight and obese children and is significantly associated with elevated ALT levels.
Early screening and lifestyle interventions are crucial to prevent disease progression and associated metabolic complications. Keywords: Metabolic dysfunction-associated steatotic liver disease, children, obesity, BMI, ALT, FibroScan, metabolic syndrome.
Page No: 1008-1011 | Full Text
Original Research Article
PREVALENCE OF PROBLEMATIC INTERNET USAGE AND ITS ASSOCIATION WITH DEPRESSION AMONG MEDICAL STUDENTS IN A SOUTHERN DISTRICT OF INDIA
http://dx.doi.org/10.70034/ijmedph.2026.2.172
Manju T.M., Judson Neslin J., Abinaya K., Sanju T.M., Suresh Balan K.U.
View Abstract
Background: Problematic internet use has emerged as a growing concern among medical students due to increasing reliance on digital technologies for academic and social purposes. Excessive internet use may adversely affect mental health, particularly by increasing the risk of depression. Objectives: To determine the prevalence of problematic internet usage and to assess its association with depression among medical students in a southern district of India. Materials and Methods: A cross-sectional study was conducted among 258 undergraduate medical students selected through systematic random sampling. Data were collected using a structured questionnaire that included the Internet Addiction Test (IAT) to assess problematic internet use and the Beck Depression Inventory (BDI) to evaluate depressive symptoms. Descriptive statistics, chi-square test, and multivariable logistic regression analysis were used to identify factors associated with problematic internet use. Results: The prevalence of problematic internet use among the participants was 13.6%, while 15.9% of students were found to have depression. Poor sleep quality, spending more than four hours online per day, higher mobile data usage, type of social media platform used, and depression showed significant associations with problematic internet use. Multivariable logistic regression analysis revealed that depression, poor sleep quality, and prolonged internet use were significant independent predictors of problematic internet use. Conclusion: A considerable proportion of medical students exhibit problematic internet usage, which is significantly associated with depression and behavioral factors. Early identification and targeted interventions are necessary to promote healthy internet use and mental well-being among medical students. Keywords: Problematic internet use, Depression, Mental Health, Electronic devices.
Page No: 1012-1018 | Full Text
Original Research Article
PREVALENCE OF WEAK D (Du) ANTIGEN IN Rh NEGATIVE BLOOD GROUP INDIVIDUALS – AN EXPERIENCE AT A TERTIARY BLOOD CENTRE
http://dx.doi.org/10.70034/ijmedph.2026.2.173
V. Kesava Rao, TCS Suman Kumar, V. Siva Sankara Naik, TRSN Lakshmi, P. Deepthi, K. Sreedhar Babu
View Abstract
Background: The Rh blood group system has over 50 antigens, with the D antigen being crucial for determining Rh-positive or Rh-negative status. Weak D (Du) refers to reduced expression of the D antigen, often missed by routine anti-D serum testing and requiring anti-human globulin (AHG) for detection. Identifying weak D individuals is vital to ensure safe and appropriate transfusion practices. Aim: To determine the prevalence of the Du antigen among blood donors and recipients found to be Rh-negative on conventional grouping. Materials and Methods: This prospective study, conducted from May 2024 to April 2025 at a tertiary blood centre, tested Rh-negative donor and patient samples for the Du antigen. ABO and Rh typing were performed using the tube method, and Rh-negative samples were further tested for weak D using the gel card system, with results recorded as numbers and percentages. Results: A total of 25,512 blood samples were analysed, with 25,073 Rh D-positive and 439 Rh D-negative cases. Among the Rh D-negative samples, 97 (22.1%) were Group A, 124 (28.2%) Group B, 43 (9.8%) Group AB, and 175 (39.8%) Group O. Of the 439 Rh D-negative samples, 48 (10.93%) were weak D positive: 8 (8.24%) Group A, 14 (11.29%) Group B, 4 (9.30%) Group AB, and 22 (12.57%) Group O. Conclusion: Weak D prevalence was 10.93% among Rh-negative individuals. Routine weak D testing is essential for accurate typing: weak D donors should be regarded as Rh-positive and weak D recipients as Rh-negative, to prevent alloimmunization and ensure transfusion safety. Keywords: Antigens, Blood Group, Blood Transfusion, Hemagglutination Tests, Rh-Hr Blood-Group System.
Page No: 1019-1023 | Full Text
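As a quick arithmetic check, the weak D counts quoted in the abstract above reproduce the stated overall prevalence; the snippet below simply recomputes the percentages from the published numbers.

```python
# Arithmetic check of the weak D figures quoted in the abstract above.
rh_neg_total = 439
weak_d = {"A": 8, "B": 14, "AB": 4, "O": 22}               # weak D positive counts
rh_neg_by_group = {"A": 97, "B": 124, "AB": 43, "O": 175}  # Rh D-negative counts

total_weak_d = sum(weak_d.values())            # 48 weak D positive overall
prevalence = 100 * total_weak_d / rh_neg_total
print(total_weak_d, round(prevalence, 2))      # -> 48 10.93

# Per-group weak D rates, for comparison with the abstract's quoted percentages.
for group, count in weak_d.items():
    print(group, round(100 * count / rh_neg_by_group[group], 2))
```

Group O contributes both the largest share of Rh D-negative samples (175/439) and the highest per-group weak D rate, consistent with the abstract's figures.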
Original Research Article
OCCURRENCE AND CLINICAL PROFILE OF SCRUB TYPHUS IN PATIENTS WITH ACUTE FEBRILE ILLNESS IN A TERTIARY CARE CENTER IN SOUTH INDIA
http://dx.doi.org/10.70034/ijmedph.2026.2.174
Soumya Kaup, Roopashree Srinivas, G K Megha, Swarupa Rani N M
View Abstract
Background: Scrub typhus is a zoonotic infection with varied clinical presentation and is one of the important causes of Acute Undifferentiated Febrile Illness (AUFI). Increased awareness regarding the presentation and complications of the disease facilitates appropriate and timely management of cases. The study was conducted with the aim of estimating the occurrence of scrub typhus among patients with acute febrile illness, characterising the demographic and clinical profile of the positive cases, and identifying factors associated with severe disease. Materials and Methods: This descriptive observational study included 200 patients aged 18 years and above who were admitted with a history of acute febrile illness from August 2024 to July 2025. The patients were tested for scrub typhus by the Weil-Felix test, and positive results were confirmed by detecting IgM antibodies. Scrub typhus cases were further characterised by their demography, clinical presentation and laboratory parameters. Patients showing features suggestive of organ involvement were classified as severe scrub typhus. The chi-square test and Fisher’s exact test were used to assess the association between categorical data and disease severity. A two-tailed p value of <0.05 was considered statistically significant for all outcomes. Results: The proportion of scrub typhus was found to be 12%, with the majority of cases reported during the monsoon months of August and September. The mean age of the patients was 38.2 ± 18.3 years, with a predominance of female patients and of patients from rural areas. The most common symptom was fever, followed by abdominal pain, cough, headache, vomiting, rash, eschar, loose stools, myalgia and breathlessness. Systemic examination findings included oedema, splenomegaly, lymphadenopathy and hepatomegaly.
The most common abnormality in laboratory parameters was thrombocytopenia seen in 75% of the cases, followed by raised transaminases, raised serum bilirubin, leucopenia, leucocytosis and raised serum creatinine levels. The most common complication observed was hepatitis, followed by multiorgan dysfunction syndrome, acute kidney injury, shock, acute respiratory distress syndrome (ARDS) and meningoencephalitis. No mortality was observed in this set of scrub typhus cases. Patients with severe disease had significantly lower platelet counts (p=0.011) and higher serum aminotransferases (p < 0.001) compared to mild cases of scrub typhus. Conclusion: Our study reiterates the demographic and clinical profile of scrub typhus and identifies factors associated with severe disease. Early diagnosis and prompt initiation of appropriate antibiotic therapy can prevent the development of severe complications and mortality. Keywords: Scrub Typhus, Acute febrile illness, Orientia tsutsugamushi, clinical profile.
Page No: 1024-1030 | Full Text
Original Research Article
ROLE OF STEREOTACTIC BODY RADIOTHERAPY / STEREOTACTIC RADIO SURGERY IN VARIOUS METASTATIC SITES – AN OBSERVATIONAL STUDY
http://dx.doi.org/10.70034/ijmedph.2026.2.175
Alladi Radhakrishnan Sharath Chandran, Sowjanya Kondru, Jinto T Joy, Malladi Ramakrishna
View Abstract
Background: Stereotactic radiosurgery (SRS) and stereotactic body radiotherapy (SBRT) have emerged as precise and effective modalities for the management of metastatic lesions in the brain and liver. This study aimed to evaluate treatment outcomes, dosimetric parameters, and toxicity profiles associated with these techniques. Materials and Methods: This observational study included 12 patients treated between May 2023 and February 2025. Ten patients underwent SRS or fractionated SRS for brain metastases, while two patients received SBRT for liver metastases. Treatment response was assessed at 3 months using RECIST/PERCIST criteria. Dosimetric indices, toxicity profiles (graded as per CTCAE), and correlations between treatment variables and outcomes were analyzed. Results: The mean age of patients was 54.5 ± 10.0 years with equal gender distribution. Brain metastases were the most common site (75%). Fractionated SRS demonstrated a trend toward improved tumor control and reduced toxicity compared to single-fraction SRS, particularly in larger lesions. Dosimetric evaluation showed optimal conformity index (mean 1.02) and acceptable gradient indices, indicating high-quality treatment planning. At 3 months, disease control (complete/partial response and stable disease) was achieved in 75% of patients. Most patients experienced only grade 1–2 toxicities, with no grade 3 or higher adverse events observed. Correlation analysis revealed moderate associations between dose per fraction and toxicity, and between number of lesions and tumor control; however, these findings were not statistically significant. SBRT for liver metastases demonstrated acceptable dosimetric and clinical outcomes, although interpretation was limited by small sample size. Conclusion: SRS and SBRT are effective and well-tolerated treatment modalities for metastatic brain and liver lesions, offering favorable tumor control with minimal toxicity.
Fractionated SRS may provide advantages in selected patients with larger or critically located lesions. These findings support individualized stereotactic treatment approaches, although larger prospective studies are required to validate these results. Keywords: Stereotactic radiosurgery; SRS; SBRT; brain metastases; liver metastases; dosimetric indices; toxicity; tumor control.
Page No: 1031-1040 | Full Text
Original Research Article
PHARMACOKINETIC AND TUMOUR-TARGETED DRUG DELIVERY MODULATION OF CHEMOTHERAPEUTIC AGENTS IN CONCURRENT CHEMORADIATION FOR IRRADIATED HYPOPHARYNGEAL SQUAMOUS CELL CARCINOMA
http://dx.doi.org/10.70034/ijmedph.2026.2.176
Amirunisa Begum, M. Sreedhar Rao, Uma Devi
View Abstract
Background: Hypopharyngeal squamous cell carcinoma (HPSCC) often presents at an advanced stage and is associated with poor functional and oncological outcomes. Concurrent chemoradiation (CCRT) has emerged as a standard organ-preserving modality; however, variability in pharmacokinetics and tumour-targeted drug delivery influences therapeutic efficacy and toxicity profiles. Aims and Objectives: To evaluate pharmacokinetic modulation and tumour-targeted delivery of contemporary chemotherapeutic agents in patients undergoing CCRT for irradiated HPSCC, and to compare clinical outcomes with patients receiving non-concurrent treatment. Materials and Methods: This prospective observational study included 67 patients with histologically confirmed HPSCC undergoing CCRT, compared with 23 patients receiving sequential or non-concurrent therapy. Demographic variables (age, gender, socioeconomic status), duration of hospital stay, and treatment-related parameters were recorded. Chemotherapeutic agents included cisplatin, carboplatin, 5-fluorouracil, docetaxel, and targeted agents such as cetuximab. Drug delivery technologies evaluated comprised nanoparticle-based carriers, liposomal formulations, and receptor-targeted monoclonal antibodies. Pharmacokinetic parameters, metabolism pathways (hepatic cytochrome P450 involvement, renal clearance), and adverse effects (mucositis, nephrotoxicity, haematological toxicity) were analyzed. Functional outcomes were assessed using dysphagia scoring, Voice Handicap Index (VHI), and radiation-related complications. Dropouts (n=4) and mortality (n=2) were documented. Data were analyzed using descriptive and comparative statistics. Results: The mean age of patients was 56.8 ± 9.4 years, with a male predominance (64%). Patients undergoing CCRT demonstrated improved tumour response rates and better functional preservation compared to the non-concurrent group.
Advanced drug delivery systems showed enhanced tumour targeting with reduced systemic toxicity, particularly with liposomal and targeted therapies. Mean hospital stay was longer in the CCRT group (9.2 ± 3.6 days) compared to controls (6.1 ± 2.8 days). Adverse effects were more frequent in the CCRT group but were manageable. Functional outcomes showed significant improvement, with reduced dysphagia scores and better VHI outcomes in the CCRT group at follow-up. Overall treatment success rate was 86.45%, with lower recurrence rates observed in the CCRT cohort. Conclusion: Pharmacokinetic optimization and tumor-targeted drug delivery significantly enhance the efficacy of concurrent chemoradiation in HPSCC, improving functional outcomes and disease control despite increased but manageable toxicity. Advanced delivery technologies represent a promising direction in minimizing systemic adverse effects while maximizing tumor specificity. Early integration of such approaches, along with structured monitoring of functional outcomes, can substantially improve overall prognosis and quality of life in patients with hypopharyngeal carcinoma. Keywords: Carcinoma, Hypopharynx, CCRT, VHI, HPSCC and Drug delivery technology.
Page No: 1041-1046 | Full Text
Original Research Article
DIAGNOSTIC UTILITY OF IMMUNOHISTOCHEMISTRY IN SMALL ROUND BLUE CELL TUMORS: AT A TERTIARY CARE INSTITUTION
http://dx.doi.org/10.70034/ijmedph.2026.2.177
K. Umamaheswari, R. Raj Kumar, G. Ilanthendral
View Abstract
Background: Small round blue cell tumors (SRBCTs) of bone and soft tissues constitute a heterogeneous group of malignant neoplasms characterized by overlapping histomorphological features, posing significant diagnostic challenges on routine hematoxylin and eosin sections. Accurate classification is essential for appropriate therapeutic management and prognostication. Immunohistochemistry (IHC) plays a crucial role in differentiating these entities. Materials and Methods: This retrospective descriptive study was conducted in the Department of Pathology of our institution from January 2023 to December 2023. Formalin-fixed, paraffin-embedded tissue blocks of tumors diagnosed histomorphologically as small round cell tumors on biopsy and resection specimens were retrieved from archival records. Only tumors arising in bone and soft tissues were included, while tumors involving bone marrow, spleen, and lymph nodes were excluded. Bony specimens were decalcified prior to routine processing. Immunohistochemistry was performed using a panel comprising CD45/LCA, CD20, CD3, CD99 (MIC2), Desmin, epithelial membrane antigen (EMA), Cytokeratin (CK), Synaptophysin, Chromogranin, and glial fibrillary acidic protein (GFAP). A total of 11 cases fulfilling the inclusion criteria were analyzed. Results: Eleven cases initially diagnosed as small round cell tumors were evaluated. The cohort comprised 6 cases of Ewing sarcoma/primitive neuroectodermal tumor (PNET), 2 cases of cutaneous T-cell lymphoma, 1 case of alveolar rhabdomyosarcoma, and 1 case of Wilms tumor. One case was reclassified as an inflammatory lesion following immunohistochemical analysis. Immunohistochemistry enabled definitive diagnosis and categorization in all cases and was essential in distinguishing morphologically overlapping entities. The majority of patients were in the 60–70-year age group, with Ewing sarcoma being the most frequent tumor in the present series.
Conclusion: Immunohistochemistry is an indispensable adjunct to histopathology in the accurate diagnosis and subclassification of small round cell tumors of bone and soft tissues. A comprehensive antibody panel facilitates definitive differentiation among morphologically similar lesions and guides appropriate clinical management. Keywords: Small round blue cell tumor; Immunohistochemistry; Ewing sarcoma; Primitive neuroectodermal tumor; Rhabdomyosarcoma.
Page No: 1047-1052 | Full Text
Original Research Article
A STUDY IN MEDICAL STUDENTS TO EVALUATE THEIR REASONS FOR SEEKING PSYCHIATRIC CARE, PERCEIVED BARRIERS TO ACCESS THIS CARE, AND THEIR EXPERIENCES RELATED TO CARE RECEIVED
http://dx.doi.org/10.70034/ijmedph.2026.2.178
Ganvit Bharvi S, Darji Krishna J, Desai Nimisha D
Background: Medical training is itself quite stressful, sometimes with catastrophic consequences, and takes a toll on the mental health of medical students. Aim & Objectives: To evaluate patterns of mental health problems in medical students seeking psychiatric care, perceived barriers to accessing that care, and their feedback on psychiatric consultation. Materials and Methods: A descriptive, cross-sectional, web-based survey of medical students who had accessed psychiatric consultation (2018–2023) was conducted. They were contacted telephonically (n=71) and invited to take part in the survey. Details of their psychiatry consultations were noted, perceived barriers were assessed with the Barriers to Access to Care Evaluation (BACE v3) scale, and their feedback about the consultation was collected through a Google Form. Their current mental health status was also reviewed by asking them to rate themselves on the Patient Health Questionnaire-9 (PHQ-9) and Generalized Anxiety Disorder-7 (GAD-7) scales. Results: The reasons for consultation were anxiety disorders (63%), depression (21%), bipolar mood disorder (9%), substance use (5.6%), and schizophrenia (1.4%), and for the majority it took >4 months to reach out for help because family (71%) or friends (42%) were not supportive. 67% of students reported recovery from past mental health issues, and 23.9% were on treatment. On evaluating perceived barriers, nearly 75% feared that the label of a mental health issue might adversely influence their career, and 32% believed that people would judge them. 91.5% were satisfied with the treatment provided and 89.8% appreciated the attitude of the treating consultant during their visit to the psychiatric OPD. The current prevalence of depression was 28% (PHQ-9 score ≥10) and of anxiety 29.5% (GAD-7 score ≥10).
Conclusion: Owing to perceived barriers to psychiatric consultation, students took a long time to seek help for their mental health issues, but they were satisfied with psychiatric consultation and most recovered. Keywords: Medical Students, Mental Health Problems, Perceived Barriers, Psychiatry Consultation.
Page No: 1053-1058 | Full Text
Original Research Article
PREVALENCE AND SEASONAL DISTRIBUTION OF INTESTINAL PROTOZOAN INFECTIONS IN DUHOK CITY, KURDISTAN REGION, IRAQ
http://dx.doi.org/10.70034/ijmedph.2026.2.179
Haneen Nawaf AlZainny, Salwa Muhsin Hasan, Iman A. Hami
Background: Intestinal infections remain among the most prevalent diseases caused by protozoan parasites. These intestinal protozoa can lead to a range of clinical symptoms, including chronic or severe diarrhea, stomach cramps, flatulence, nausea, vomiting, loss of appetite, fatigue, low-grade fever, and weight loss. School-aged children (5–17 years) are particularly vulnerable, often exhibiting higher infection rates due to inadequate hygiene practices and frequent contact with contaminated environments, such as soil during outdoor activities. Materials and Methods: In the current retrospective study, data were obtained from the archives of the parasitology section of the Laboratory Department at Azadi Teaching Hospital and Heevi Pediatric Hospital in Duhok City. The study included positive cases of intestinal parasitic infections recorded between January 2024 and December 2024, totaling 398 patients. These pre-existing stool specimens had been examined macroscopically, followed by microscopic analysis using the wet mount technique. Results: The findings revealed that infection rates were nearly similar between males and females, with 211 (53%) males and 187 (47%) females. Adults exhibited a higher infection rate compared to children, with 247 (62%) and 151 (38%) cases, respectively. Entamoeba histolytica was the most frequently detected parasite, present in 388 (97%) of cases. Other detected parasites included Giardia lamblia in 8 cases (2%), Trichomonas hominis in 1 case (0.25%), and Blastocystis hominis in 1 case (0.25%). Conclusion: The seasonal distribution of infection showed the highest prevalence during summer, with 177 cases (44.5%), followed by autumn with 123 cases (30.9%), spring with 60 cases (15%), and winter with 38 cases (9.5%). Keywords: Intestinal protozoa, Seasons, Wet mount, Parasite, Temperature, Technique.
Page No: 1059-1064 | Full Text
Original Research Article
EFFECT OF CHRONIC PAIN ON THE DAILY LIFE OF PATIENTS RECEIVING TREATMENT FROM A PAIN CLINIC IN A TERTIARY CARE CENTRE IN SOUTH INDIA - A CROSS-SECTIONAL STUDY
http://dx.doi.org/10.70034/ijmedph.2026.2.180
Aswini Lakshminarasimhan, Aarcha S Nair
Background: Chronic pain is a complex condition affecting multiple domains of daily life, with limited region-specific evidence from South India. This study assessed the association between pain intensity and daily life interference and evaluated the influence of treatment modalities and patient characteristics. Materials and Methods: A serial cross-sectional observational study was conducted among adult chronic pain patients (≥3 months) attending a tertiary care centre. Pain intensity and interference were assessed using the Visual Analogue Scale (VAS) and Brief Pain Inventory (BPI). Statistical analysis included Spearman’s correlation and multivariable linear regression adjusting for confounders. Results: Among 73 participants (mean age 51.7±11.8 years; 70% female), baseline median VAS score was 7 (IQR 5–8). A strong positive correlation between pain severity and interference was observed before treatment (ρ=0.76, p<0.001) and after treatment (ρ=0.87, p<0.001). Mean BPI scores significantly decreased from 5.4±1.8 to 3.5±1.8 (p<0.001). Pain intensity measures also improved: average pain reduced from 5.8±2.0 to 3.9±2.0 (p<0.001). Opioid-only therapy showed greater median pain relief (2.29, IQR 1.68) compared to non-opioid therapy (1.07, IQR 1.73; p=0.0117). Pain intensity independently predicted interference (β=0.639, 95% CI: 0.517–0.761; p<0.001). Malignant conditions showed poorer response (β=–0.970, p=0.002). Conclusion: Chronic pain significantly impairs daily functioning, with pain intensity strongly predicting interference. Combined and opioid-based therapies provide superior relief, supporting multimodal management approaches. Keywords: Chronic Pain, Pain Measurement, Quality of Life, Analgesics, Opioid, Cross-Sectional Studies.
Page No: 1065-1071 | Full Text
Original Research Article
ROLE OF TRANSVAGINAL SONOGRAPHY IN RISK STRATIFICATION OF POSTMENOPAUSAL BLEEDING: CORRELATION WITH HISTOPATHOLOGY
http://dx.doi.org/10.70034/ijmedph.2026.2.181
Priyanka Yadav, Sachin Raiya, Sahil Yadav
Background: Postmenopausal bleeding is a significant clinical concern due to its association with endometrial pathology, including malignancy. Transvaginal sonography serves as a non-invasive tool for assessing endometrial thickness and guiding further management. Materials and Methods: This cross-sectional study included 226 postmenopausal women presenting with bleeding at a tertiary care center. All participants underwent transvaginal sonography for measurement of endometrial thickness, followed by histopathological evaluation where indicated. Diagnostic performance of different endometrial thickness cut-offs was assessed using ROC curve analysis. Results: The mean endometrial thickness was 8.6 ± 4.3 mm. Benign lesions were most common (74.3%), with endometrial atrophy predominating (46.0%), while premalignant and malignant lesions accounted for 16.8% and 8.9%, respectively. A significant association was observed between increasing endometrial thickness and pathological severity (p < 0.001). At a cut-off of ≥4 mm, sensitivity and negative predictive value were 100%, whereas a threshold of 8.5 mm improved specificity (78.0%) with an AUC of 0.89. Conclusion: Transvaginal sonography is a reliable first-line modality in postmenopausal bleeding. Endometrial thickness effectively stratifies risk, with lower thresholds excluding malignancy and higher thresholds improving diagnostic precision. Keywords: Postmenopausal bleeding, Endometrial thickness, Transvaginal sonography, Endometrial carcinoma, ROC curve.
Page No: 1072-1077 | Full Text
Original Research Article
COMPARATIVE OUTCOMES OF SUBMENTAL AND ESTLANDER FLAPS IN POST-ONCOLOGIC ORAL COMMISSURE RECONSTRUCTION
http://dx.doi.org/10.70034/ijmedph.2026.2.182
Supriya Goroba Gaware, Manu S Babu
Background: This study compares the submental and Estlander flaps, both commonly used for oral cavity reconstruction, to evaluate their efficacy, limitations, and suitability for various oral commissure defect morphologies, focusing on functional and aesthetic outcomes. Materials and Methods: A prospective study was conducted from June 2022 to June 2025 at a single tertiary care center, including 30 cases for each flap type (submental and Estlander). Inclusion criteria involved patients needing reconstruction for oncological resections, while exclusions covered prior radiation or significant comorbidities. Results: Baseline demographics were comparable between the cohorts. Submental flap procedures averaged 180 minutes. Minor complications, such as partial flap necrosis, occurred in 9.09% of submental artery perforator flap cases and 8.16% of submental island flap (SIF) with anterior belly of digastric muscle (DM) cases. The submental flap's abundant tissue supply and robust vascularity make it reliable for larger and more complex defects, often comparable to free tissue flaps. However, its intricate vascular anatomy requires meticulous dissection to prevent complications. The Estlander flap provides excellent aesthetic results for smaller defects but lacks the tissue volume for extensive reconstructions. Surgical expertise and the need for specific tissue characteristics also influence the final choice. Conclusion: The decision between submental and Estlander flaps is highly individualized, depending on defect size, location, patient age and comorbidities, surgeon expertise, and functional and aesthetic considerations. Keywords: Oral commissure reconstruction, Submental flap, Estlander flap, Head and neck surgery, Flap survival, Complications, Aesthetic outcomes, Functional outcomes.
Page No: 1078-1083 | Full Text
Original Research Article
STUDY ON THE KNOWLEDGE, ATTITUDE AND PRACTICE REGARDING SAFE FOOD HANDLING AMONG PRIMARY FOOD HANDLERS IN RURAL AREAS OF SIVAKASI, TAMILNADU
http://dx.doi.org/10.70034/ijmedph.2026.2.183
Anbu M, R.Uma Maheswari, Thirumalai Kumar
Background: Food safety means assurance that food is acceptable for human consumption according to its intended use; while everyone is exposed to foodborne health risks, it is the poor who are most exposed and vulnerable to these risks. According to the WHO, the five key principles of food hygiene are: prevent pathogens from people, pets, and pests from contaminating food; keep raw and cooked foods separate to avoid cross-contamination; cook foods for the appropriate time and at the appropriate temperature to kill pathogens; keep food at safe temperatures; and use only clean water and raw materials. The objective was to assess the knowledge, attitude, and practice (KAP) of safe food handling among households in the rural areas of Sivakasi, Tamil Nadu. Materials and Methods: This cross-sectional study was done in rural areas of Sivakasi from June 2022 to January 2023; the sample size was 178, and multi-stage random sampling was adopted across 5 villages. An interviewer-administered, semi-structured questionnaire collected socio-demographic details and knowledge, attitude, and self-reported practice regarding safe food handling. Results: Among the 179 study participants, adequate knowledge of safe food handling among primary food handlers was 78.8%, attitude 75.4%, and self-reported practice 67.4%. Self-reported practice towards food safety was not adequate; such non-compliance could result in outbreaks of foodborne illness. Conclusion: Although knowledge (78.8%) and attitude (75.4%) regarding safe food handling were adequate, self-reported practice (67.4%) was not. "Knowledge is of no value unless you put it into practice"; such non-compliance could result in outbreaks of foodborne illness. There is therefore an urgent need to raise interest in food safety.
Better female educational status would improve KAP on food safety measures. Media campaigns pertaining to food safety may be organized, as they provide an excellent opportunity for such information to reach a large number of consumers, including those at home. Knowledge of the WHO's five keys to food hygiene needs to be sustained: keep clean, separate raw and cooked food, cook thoroughly, keep food at safe temperatures, and use safe water. Keywords: Primary Food handlers, Personal hygiene practices, Safe food handling, Cross-sectional study, KAP.
Page No: 1084-1089 | Full Text
Original Research Article
DELAYED SECONDARY REPAIR FOR OBSTETRIC ANAL SPHINCTER INJURIES (OASIS): SINGLE CENTRE EXPERIENCE
http://dx.doi.org/10.70034/ijmedph.2026.2.184
Arun Jenagavel S, Prabhakaran A, Karthikeyan M
Background: Obstetric anal sphincter injuries (OASIS) are a significant complication of childbirth, often resulting in fecal incontinence and reduced quality of life. This single-centre case series evaluates the clinical outcomes of delayed secondary repair of OASIS using various sphincteroplasty techniques in patients presenting with delayed or missed injuries. Materials and Methods: The study included 8 female patients treated at a single institution over 3 years. All participants had a history of instrumental delivery and episiotomy, with symptoms of incontinence. Preoperative and postoperative assessments were performed using the Wexner scoring system. Overlapping sphincteroplasty with Martius flap was performed in 4 cases, direct apposition sphincteroplasty in 2 cases, and overlapping sphincteroplasty alone in 2 cases. Follow-up was conducted at 3 months, 6 months, 1 year, and 2 years to evaluate symptomatic improvement and patient satisfaction. Results: The mean age of the participants was 32 years (range: 22–45 years). All patients experienced symptomatic improvement following surgery, with significant reductions in Wexner scores and high patient satisfaction rates. One patient had a superficial infection, which settled with antibiotics and conservative management. No incontinence symptoms were reported at regular follow-up. Conclusion: Delayed sphincteroplasty is an effective surgical approach for treating missed or inadequately repaired OASIS, offering satisfactory outcomes. However, prevention through improved obstetric practices remains the optimal strategy. Keywords: Incontinence, OASIS, Wexner score, Sphincteroplasty.
Page No: 1090-1095 | Full Text
Original Research Article
PREVALENCE OF PULMONARY TUBERCULOSIS IN HEALTH CAMPS, DOOR TO DOOR SURVEYS, CONTACTS OF INDEX CASES AND AT NIKSHAY DIWAS IN TUBERCULOSIS UNIT OF RAMA MEDICAL COLLEGE MANDHANA KANPUR
http://dx.doi.org/10.70034/ijmedph.2026.2.185
Sunny Kumar, Ashish Shukla, Rishabh Gupta
Background: Pulmonary tuberculosis (TB) is a major public health concern, especially in developing countries. Early detection is critical for effective treatment and reducing transmission. Community-based approaches such as health camps and door-to-door surveys can aid in identifying undiagnosed cases. Objective: To determine and compare the prevalence of pulmonary TB identified through health camps and door-to-door surveys in a selected population. Materials and Methods: A cross-sectional study was conducted at the TB Unit of Rama Medical College-Hospital and Research Centre, Kanpur Nagar. Screening was performed in two settings: health camps and direct household visits. Results: A total of 207 index cases were screened. Door-to-door surveys identified more TB cases compared to health camps. Higher prevalence was observed in males aged 25–50 years. Common risk factors included smoking, undernutrition, and past TB history. Conclusion: There is a significant hidden burden of pulmonary TB in the community. Door-to-door surveys were more effective in identifying TB cases than health camps. Community-based active case finding should be integrated into national TB control programs. Keywords: Pulmonary tuberculosis (TB), Community-based approaches, Index Cases, Nikshay Diwas.
Page No: 1096-1100 | Full Text
Original Research Article
IMPACT OF RADIOTHERAPY-INDUCED FIBROSIS ON DIFFICULT AIRWAY PREDICTORS IN ORAL CANCER SURGERY: A PROSPECTIVE OBSERVATIONAL STUDY
http://dx.doi.org/10.70034/ijmedph.2026.2.186
Sana Qureshi, Mohammed Saif Niyazi
Background: Radiotherapy is an essential component of multimodal treatment for oral cavity malignancies. However, radiation exposure frequently leads to progressive fibrosis of soft tissues in the head and neck region, which may alter airway anatomy and complicate airway management during anaesthesia. The present study aimed to evaluate the effect of radiotherapy-related fibrosis on commonly used predictors of difficult airway in patients undergoing surgery for oral cancer. Materials and Methods: This prospective observational study included 80 adult patients scheduled for elective oral cancer surgery under general anaesthesia. Patients were divided into two groups: those who had previously received radiotherapy (RT group, n=40) and those without prior radiotherapy (Non-RT group, n=40). Preoperative airway assessment included Mallampati classification, thyromental distance, inter-incisor distance, sternomental distance, and neck extension. Laryngoscopic view during intubation was graded using the Cormack–Lehane classification. The incidence of difficult laryngoscopy was compared between groups. Results: Patients who had undergone radiotherapy demonstrated significantly higher Mallampati grades, decreased mouth opening, shorter thyromental distance, and restricted neck mobility compared with those without previous radiation exposure (p < 0.05). Difficult laryngoscopy (Cormack–Lehane grade III or IV) occurred in 35% of patients in the RT group compared with 10% in the Non-RT group. Reduced thyromental distance and limited cervical mobility showed the strongest association with difficult laryngoscopy. Conclusion: Radiotherapy-related fibrosis significantly influences airway anatomy and increases the frequency of predictors associated with difficult airway. Careful airway evaluation and preparation for advanced airway management techniques are essential when anaesthetising oral cancer patients with prior radiation exposure. 
Keywords: Oral cancer, radiotherapy fibrosis, difficult airway, airway predictors, anaesthesia.
Page No: 1101-1103 | Full Text
Original Research Article
PROSPECTIVE STUDY OF FUNCTIONAL OUTCOME OF DISPLACED MIDDLE THIRD CLAVICULAR FRACTURES TREATED BY PLATE OSTEOSYNTHESIS
http://dx.doi.org/10.70034/ijmedph.2026.2.187
N. Navya Tez, L. Manohar Reddy, K. Sudhakar, T. Prudhvi Raj
Background and Aim: To evaluate the functional outcomes of surgical management of displaced middle third clavicle fractures by open reduction and internal fixation (ORIF) with a locking compression plate. Materials and Methods: This prospective interventional study was conducted in the Department of Orthopaedics, NRI Institute of Medical Sciences (NRIIMS), Sangivalasa, Visakhapatnam, Andhra Pradesh, India, over 16 months from August 2022 to November 2023. The study was interventional, as patients underwent surgical treatment of middle 1/3rd clavicular fractures by plating. After approval from the Ethics Committee, patients admitted to the Orthopaedics ward at NRIIMS with fractures of the middle 1/3rd of the clavicle were taken as the study sample. Results: The study included 30 patients with displaced middle 1/3rd clavicular fractures. 30% of the patients were aged 41 to 50 years and 26.67% were aged 21 to 30 years; 90% of subjects were males. 96.67% had a road traffic accident. 63.33% underwent surgery lasting 60–70 minutes. 50% had a right-sided clavicle fracture. 60% had AO/OTA type 2C fractures. 26.67% had associated minor injuries, and 16.67% had associated rib fractures. Fracture union time was 11 to 12 weeks for 36.67% of patients; 3.33% had non-union. The mean DASH score was 15.2 at 3 months. 10% developed postoperative infection, and 10% had a hospital stay of more than 7 days. There was a significant association between age and infection: patients aged above 50 years developed infection due to comorbid conditions such as diabetes mellitus. There was also a significant association between age and duration of hospital stay, as patients with superficial infection were treated with IV antibiotics, which prolonged the stay. 
Conclusion: This interventional study concluded that the use of locking plates for displaced midshaft clavicle fractures results in union with very good functional outcome and is associated with low complication rates. All patients were discharged in stable condition. A randomized prospective interventional study is suggested to establish the superiority of operative management over conservative treatment. Keywords: Clavicle fracture, ORIF, locking compression plate, DASH, Functional outcome.
Page No: 1104-1110 | Full Text
Original Research Article
ALTERATIONS IN COAGULATION PARAMETERS IN PREGNANCY INDUCED HYPERTENSION AND THEIR CORRELATION WITH SEVERITY OF PREECLAMPSIA
http://dx.doi.org/10.70034/ijmedph.2026.16.2.188
Rajkumar R., J. Maheswari, A. Arputham
Background: Pregnancy induced hypertension (PIH) is a major cause of maternal and perinatal morbidity and mortality. It is associated with endothelial dysfunction and activation of the coagulation system, leading to alterations in hemostatic parameters. Early identification of these changes is crucial for timely management and prevention of complications. The aim was to evaluate alterations in coagulation parameters in pregnancy induced hypertension and to correlate them with the severity of preeclampsia. Materials and Methods: This prospective case-control study was conducted on 200 pregnant women, including 100 normotensive controls and 100 women with PIH. Coagulation parameters including platelet count, bleeding time (BT), clotting time (CT), prothrombin time (PT), and activated partial thromboplastin time (APTT) were assessed. The PIH group was further categorized into mild preeclampsia, severe preeclampsia, and eclampsia. Statistical analysis was performed using appropriate tests, and a p-value <0.05 was considered significant. Results: PIH patients showed a significant reduction in platelet count and prolongation of BT, CT, PT, and APTT compared to normotensive women (p < 0.001). The prevalence of coagulation abnormalities such as thrombocytopenia and prolonged coagulation times was significantly higher in PIH cases. A strong correlation was observed between worsening coagulation parameters and increasing severity of preeclampsia, with progressive deterioration from mild to severe cases and eclampsia (p < 0.001). Conclusion: Coagulation parameters are significantly altered in PIH and correlate strongly with disease severity. Routine assessment of the coagulation profile can aid in early diagnosis, risk stratification, and prevention of complications, thereby improving maternal and fetal outcomes. Keywords: Pregnancy Induced Hypertension, Coagulation Parameters, Preeclampsia Severity.
Page No: 1111-1116 | Full Text
Original Research Article
USING CINE MAGNETIC RESONANCE IMAGING TO EVALUATE THE DEGREE OF INVASION IN MEDIASTINAL MASSES
http://dx.doi.org/10.70034/ijmedph.2026.2.189
Ramakanth Veluru, Ankammarao Tanniru, Johny Prasad Bollipo
Background: Mediastinal masses present diagnostic challenges due to their potential cardiovascular invasion. Conventional imaging modalities like CT and echocardiography have limitations in assessing cardiovascular involvement. Cine magnetic resonance imaging (cine MRI) is emerging as a valuable tool for evaluating mediastinal masses' impact on nearby cardiovascular structures. This study explores the utility of cine MRI in assessing cardiovascular invasion by mediastinal masses. Materials and Methods: A retrospective analysis of patients with mediastinal masses referred for evaluation between January 1, 2020, and December 31, 2021, was conducted. Inclusion criteria encompassed patients with mediastinal masses confirmed by CT and/or histopathology. All eligible patients underwent cine MRI using a Siemens Magnetom Avanto 1.5T MRI scanner. Cine MRI sequences, including axial, coronal, and sagittal views, visualized the dynamic relationship between the mediastinal mass and adjacent cardiovascular structures. Cardiovascular invasion was assessed by experienced radiologists using cine MRI findings and correlated with surgical or histopathological results. Results: A total of 68 patients with mediastinal masses were included in the study. Cine MRI demonstrated a sensitivity of 92.3%, specificity of 87.5%, positive predictive value of 85.7%, and negative predictive value of 93.8% in detecting cardiovascular invasion by mediastinal masses. Among the cases with confirmed cardiovascular invasion, cine MRI accurately delineated the extent of involvement, with a mean overestimation of 1.2 cm and a mean underestimation of 0.8 cm compared to surgical or histopathological findings. The interobserver agreement between radiologists for cine MRI assessment was substantial, with a kappa value of 0.82. Conclusion: Cine MRI proves to be a valuable imaging modality for assessing cardiovascular invasion by mediastinal masses. 
It offers high sensitivity and specificity, aiding in accurate detection and extent delineation. Cine MRI's dynamic visualization enhances its utility in surgical planning and clinical management decisions. In cases of mediastinal masses with suspected cardiovascular invasion, cine MRI should be considered an essential diagnostic tool. Keywords: cine MRI, mediastinal masses, cardiovascular invasion, imaging, sensitivity, specificity, surgical planning, dynamic visualization.
Page No: 1117-1120 | Full Text
Original Research Article
RETROSPECTIVE COMPARATIVE STUDY OF LAPAROSCOPIC AND OPEN APPROACHES TO FECAL DIVERSION IN LOCALLY ADVANCED CARCINOMA RECTUM
http://dx.doi.org/10.70034/ijmedph.2026.2.190
Mohamed Yasar M Y, Prabhakaran A, Karthikeyan M
Background: Locally advanced rectal cancer is commonly managed with neoadjuvant chemoradiotherapy (NACRT) to downstage the disease, often necessitating diversion procedures for obstruction, incontinence, or fistula. This study aimed to compare laparoscopic and open methods of faecal diversion performed before the initiation of NACRT or curative surgery. Materials and Methods: This retrospective study included 52 patients diagnosed with biopsy-proven locally advanced rectal carcinoma (>T3N1) at Tirunelveli Medical College between January 2022 and April 2025. All included patients were treated with curative intent and were divided into two groups: laparoscopic faecal diversion (n=38) and open faecal diversion (n=14). Depending on the planned definitive surgery, those planned for low anterior resection (LAR) underwent transverse loop colostomy and those planned for abdominoperineal resection (APR) underwent sigmoid loop colostomy. Results: The mean patient age was 56 and 54 years in the laparoscopic and open groups, respectively (p=0.5). Obstruction was the primary indication for diversion in both groups, with similar rates (82.4% vs. 82%, p=0.56). In the laparoscopic group, 24 patients underwent transverse loop colostomy and 14 underwent sigmoid loop colostomy, compared to eight and six patients in the open group, respectively. The laparoscopic group showed significantly faster initiation of NACRT (7.43 vs. 20.43 days, p=0.001), earlier oral intake (1.08 vs. 2.8 days, p=0.02), and shorter hospital stay (2.5 vs. 6 days, p=0.001). Stage migration was more frequent in the laparoscopic surgery group (32% vs. 18%, p=0.003). Conclusion: Laparoscopic diversion enables early NACRT initiation, faster recovery, and fewer complications while improving stoma creation and disease assessment with better anatomical visualisation. Keywords: Locally advanced rectal carcinoma, Transverse loop colostomy, Sigmoid loop colostomy, Disease downstaging, Stoma creation.
Page No: 1121-1129 | Full Text
Original Research Article
ADVERSE EFFECT OF THE SPONTANEOUS PASSAGE OF LOWER URETERIC CALCULI 5-10MM SIZE UPON USAGE OF TAMSULOSIN AND DEFLAZACORT IN COMPARISON TO USAGE OF TAMSULOSIN ALONE
http://dx.doi.org/10.70034/ijmedph.2026.2.191
Abuzar Ahmadi
Background: Urolithiasis remains a significant urological disorder worldwide, with lifetime prevalence estimates ranging between approximately 4% and 15% in various populations and a recurrence rate reaching up to 50% in some series. The aim was to evaluate and compare the adverse effects associated with the spontaneous passage of 5–10 mm lower ureteric calculi in patients receiving combination therapy of tamsulosin and deflazacort versus tamsulosin alone. Materials and Methods: This was a prospective comparative observational study conducted over 18 months at the Department of Urology of a tertiary care centre, including 60 patients with lower ureteric calculi measuring 5–10 mm. Results: In the present study involving 119 patients, adverse effects were generally mild and comparable between the two groups. Headache was reported in 25 patients (41.67%) in Group A and 21 patients (35.59%) in Group B, with no significant difference (p = 0.496). Dizziness occurred in 21 patients (35%) from Group A and 18 patients (30.51%) from Group B (p = 0.602). Conclusion: We concluded that the spontaneous passage rates of 5–10 mm lower ureteric calculi were comparable between patients receiving tamsulosin with deflazacort (Group A) and those receiving tamsulosin alone (Group B), demonstrating similar overall efficacy. Group A showed a significantly higher passage rate for stones located in the lower ureter compared to VUJ stones, suggesting a potential benefit of combination therapy for specific stone locations. Keywords: Urolithiasis, Tamsulosin, Deflazacort, Spontaneous stone passage, Ureteric colic.
Page No: 1130-1134 | Full Text
Original Research Article
CORRELATION BETWEEN THROMBOCYTOPENIA AND SEVERITY OF DENGUE FEVER IN ADMITTED PATIENTS IN A TERTIARY CARE HOSPITAL
http://dx.doi.org/10.70034/ijmedph.2026.2.192
Rupesh Patidar, Vivek Sullere
Background: Dengue fever has emerged as a major global public health concern and is currently the most important vector-borne viral disease in tropical regions. Objective: To evaluate the correlation between thrombocytopenia and severity of dengue infection among hospitalized patients in a tertiary care centre. Materials and Methods: This hospital-based prospective observational study was conducted in the Department of General Medicine, including the medical and fever wards, at Bombay Hospital, Indore, Madhya Pradesh, India, a tertiary care teaching hospital. Results: Abnormal vital signs – low or raised temperature, raised pulse rate, hypoxia (SpO₂ 90–94%), and abnormal blood pressure – all showed significant associations with severity (p < 0.001 for most, p = 0.026 for SpO₂). SGOT (AST) was high in 41.5% and significantly associated with severity (p < 0.001): 55.4% of those with high SGOT had severe dengue, and none with normal SGOT developed severe dengue. Similarly, SGPT (ALT) was high in 39.5% and significantly associated (p < 0.001): 53.2% of those with high SGPT had severe dengue. Hepatomegaly (29.0%), gallbladder wall edema (21.0%), ascites (20.0%), and pleural effusion (16.0%) were common findings; 49.0% had a normal ultrasound. Non-severe dengue accounted for 23.5% of cases, dengue with warning signs 50.5%, and severe dengue 26.0%. Conclusion: Thrombocytopenia, particularly severe (<50,000/µL), is a strong and independent predictor of severe dengue. Moderate thrombocytopenia (50,000–99,999/µL) uniformly indicates dengue with warning signs, mandating close monitoring. Keywords: Thrombocytopenia, Severity of dengue fever, Hospitalized patients.
Page No: 1135-1141 | Full Text
Original Research Article
THORACIC SEGMENTAL SPINAL ANAESTHESIA VERSUS GENERAL ANAESTHESIA IN PATIENTS UNDERGOING LAPAROSCOPIC CHOLECYSTECTOMY: RANDOMIZED CONTROL TRIAL
http://dx.doi.org/10.70034/ijmedph.2026.2.193
Faizan Ahmad, Bhawna Singh, Mohd Khalik
View Abstract
Background: Laparoscopic cholecystectomy is the gold standard surgical treatment for symptomatic gallstone disease and other benign gallbladder pathologies. It has significantly reduced morbidity, shortened hospital stays, and improved patient outcomes compared to open cholecystectomy. The objective is to compare hemodynamic parameters and adverse effects between thoracic segmental spinal anaesthesia (TSSA) and general anaesthesia (GA) in patients undergoing laparoscopic cholecystectomy. Materials and Methods: The present study was conducted on 116 patients aged 18 to 60 years; Group A received general anaesthesia and Group B received thoracic segmental spinal anaesthesia. Results: Hemodynamic stability was superior under TSSA compared to GA. Patients maintained higher systolic, diastolic, and mean arterial pressures throughout the surgery, demonstrating that the limited sympathetic blockade of TSSA prevents major cardiovascular depression. The incidence of adverse effects such as hypotension, bradycardia, and PONV was low and comparable between groups. Conclusion: Thoracic segmental spinal anaesthesia emerges as a safe, efficient, and patient-friendly alternative to general anaesthesia for laparoscopic cholecystectomy in appropriately selected patients. Keywords: Hemodynamic Parameters, Adverse Effect, Thoracic Segmental Spinal Anaesthesia, General Anaesthesia, Laparoscopic Cholecystectomy.
Page No: 1142-1148 | Full Text
Original Research Article
COMPUTED TOMOGRAPHY IMAGING FEATURES IN MODERATE TO SEVERE TRAUMATIC BRAIN INJURY AND ITS CORRELATION WITH CLINICAL OUTCOME IN TERTIARY CARE HOSPITAL
http://dx.doi.org/10.70034/ijmedph.2026.2.194
LVSSN Prasanna Pidaparti, Noorunisa Begum, Satya Suneetha Kommana, Chandrasekhar Reddy Kalakoti, Aniket Mishra
View Abstract
Background: Traumatic brain injury (TBI) is a leading cause of death and disability worldwide. Early CT imaging can identify lesions that not only guide acute management but also predict patient outcomes. Data from Indian populations with moderate to severe TBI remain limited. The aim is to evaluate the prognostic value of initial CT features in predicting mortality among patients with moderate to severe TBI. Materials and Methods: This prospective cohort study included 85 consecutive patients (GCS ≤12) with neuroparenchymal abnormalities on initial non-contrast head CT. CT features recorded included extradural hematoma, subdural hematoma, traumatic subarachnoid hemorrhage, intraventricular hemorrhage, hemorrhagic contusions, diffuse axonal injury, basal cistern status, midline shift, and herniation. Mortality and functional outcomes at discharge were assessed using the Glasgow Outcome Scale. Statistical analysis comprised chi-square/Fisher’s exact tests and multivariate logistic regression. Results: Overall mortality was 22.4%. Independent predictors of death were intraventricular hemorrhage (p=0.017), basal cistern effacement (p=0.042), midline shift >10 mm (p=0.036), diffuse axonal injury grade 3 (p=0.040), and herniation (p=0.080, borderline significance). Subdural hematoma, extradural hematoma, contusions, and traumatic subarachnoid hemorrhage were not significantly associated with mortality. Good recovery (GOS 1) occurred in 56.5% of patients. Conclusion: Baseline CT features such as intraventricular hemorrhage, basal cistern effacement, marked midline shift, high-grade diffuse axonal injury, and herniation are strong predictors of mortality in moderate to severe TBI. Keywords: Traumatic Brain Injury (TBI), Glasgow Coma Scale (GCS), Intracranial Hemorrhage, Cerebral Edema, Moderate to Severe Head Injury.
Page No: 1149-1152 | Full Text
Original Research Article
KNOWLEDGE, ATTITUDE, AND PRACTICE REGARDING PAP SMEAR SCREENING FOR CERVICAL CANCER AMONG FEMALE HEALTHCARE WORKERS IN A TERTIARY CARE TEACHING HOSPITAL IN CENTRAL INDIA: A CROSS-SECTIONAL STUDY
http://dx.doi.org/10.70034/ijmedph.2026.2.195
Mahananda Tukaram Bele, Sangita Prafulla Kadu, Rakhi Kisanrao Suryawanshi, Snehlata R Hingway
View Abstract
Background: Cervical cancer is a major public health concern, particularly in developing countries like India, despite being preventable through effective screening methods such as the Papanicolaou (Pap) smear. Healthcare workers play a crucial role in promoting screening practices; however, gaps in their knowledge, attitude, and practice (KAP) may hinder effective implementation of preventive strategies. This study aimed to assess the KAP regarding Pap smear screening among women healthcare workers in a tertiary care teaching hospital in Central India. Materials and Methods: A cross-sectional study was conducted among 250 women healthcare workers at Dr. Rajendra Gode Medical College, Amravati, from November 2025 to January 2026. Participants were selected based on willingness to participate and provided informed consent. Data were collected using a structured and pretested questionnaire assessing sociodemographic characteristics and KAP related to cervical cancer screening. Data were analyzed using SPSS software. Descriptive statistics were applied, and the association between knowledge and practice was evaluated using the Chi-square test, with p < 0.05 considered statistically significant. Results: A high proportion of participants were aware of cervical cancer (87.2%) and Pap smear (81.6%); however, only 56.8% had adequate knowledge of risk factors, and 38.4% were aware of the correct screening interval. Although 84.8% perceived screening as necessary and 74.4% expressed willingness to undergo testing, only 32.8% had ever undergone a Pap smear. Major barriers included absence of symptoms (57.1%), negligence (52.4%), and lack of time (48.8%). A statistically significant association was observed between knowledge and screening practice (p = 0.0003). Conclusion: Despite good awareness and positive attitude, the practice of Pap smear screening among healthcare workers was suboptimal. 
Strengthening educational interventions, addressing barriers, and implementing institutional screening programs are essential to improve screening uptake. Keywords: Cervical cancer, Pap smear, Knowledge attitude practice, Healthcare workers, Screening uptake.
Page No: 1153-1158 | Full Text
Original Research Article
ASSESSMENT OF KNOWLEDGE, ATTITUDE, AND PRACTICES ABOUT HANDLING MEDICOLEGAL CASES AMONG HEALTHCARE PROFESSIONALS IN A TERTIARY CARE HOSPITAL: A CROSS-SECTIONAL STUDY
http://dx.doi.org/10.70034/ijmedph.2026.2.196
Sachin S. Sonawane, Mahendra Wankhede, Rajesh B. Sukhdeve, Shashank Singh, Sarah Al Hinnawi, Shivkumar R. Kolle, Chancey Wood
View Abstract
Background: Medicolegal cases (MLCs) form an integral component of clinical practice, requiring healthcare professionals to possess adequate knowledge, appropriate attitudes, and standardized practices. However, increasing medicolegal litigation and patient awareness have highlighted gaps in medicolegal competence among healthcare providers. The aim is to assess the knowledge, attitude, and practices (KAP) regarding medicolegal cases among healthcare professionals in a tertiary care center and to identify factors associated with difficulties in handling medicolegal cases. Materials and Methods: A cross-sectional, questionnaire-based study was conducted among 206 healthcare professionals, including interns, residents, medical officers, and faculty. A structured, self-administered questionnaire assessed knowledge, attitudes, and practices related to medicolegal case handling. Data were analyzed using descriptive statistics and inferential methods, including Chi-square tests to assess associations. Results: Most participants demonstrated satisfactory basic knowledge (e.g., awareness of MLC registration 92.4%); however, gaps existed in legal provisions (60.2%), consent (66.9%), and chain-of-custody (63.1%). Positive attitudes were observed, with 80.1% acknowledging the importance of documentation, though only 54.7% felt confident handling MLCs. A significant proportion (78.8%) reported difficulty in managing MLCs, especially among less experienced professionals. A statistically significant association was found between clinical experience and difficulty faced (p < 0.001). Conclusion: Despite adequate foundational awareness, significant gaps exist in procedural knowledge and practical confidence. Structured training programs, regular continuing medical education (CME), and institutional protocols are essential to improve medicolegal competence among healthcare professionals. 
Keywords: Medicolegal cases, Knowledge, attitude, practice, Healthcare professionals, medical law, Consent, Medical negligence.
Page No: 1159-1165 | Full Text
Original Research Article
PLATELET INDICES AND DEMOGRAPHIC CORRELATES AS PREDICTIVE INDICATORS OF SEVERITY MANIFESTATIONS IN DENGUE FEVER: A PROSPECTIVE OBSERVATIONAL STUDY
http://dx.doi.org/10.70034/ijmedph.2026.2.197
Vishal Kumar C J, Suma H V, H K Manjunath, Bhargavi Mohan, Manasa R S, Ragini A K, Sudhamani, Tejas R
View Abstract
Background: Dengue fever is a major vector-borne viral illness associated with significant morbidity and mortality worldwide. Haemorrhagic complications represent the most dangerous outcomes, with thrombocytopenia as a defining feature. Platelet indices such as mean platelet volume (MPV) and platelet distribution width (PDW) may serve as reliable early markers of severe disease. The objective is to evaluate changes in platelet indices among dengue patients, assess the role of age and gender in thrombocytopenia, and determine their predictive potential for dengue haemorrhagic fever. Materials and Methods: A prospective observational study was conducted at a tertiary care hospital between May 2023 and August 2023. A total of 240 patients with dengue-like symptoms were screened, of whom 129 serologically confirmed dengue-positive patients were enrolled with informed consent. Demographics and platelet indices were recorded and analysed using SPSS software with relevant tests. Results: Significant thrombocytopenia was noted more often in the younger and older age groups, with around 32% of the total sample falling in the <20,000 to 50,000 cells/μL range, compared to the middle age groups (p < 0.05). There were no significant variations between males and females (p > 0.05). A significant difference in mean plateletcrit (PCT) was noted between subjects of different age groups. Conclusion: Younger and older age groups are more likely to experience significant thrombocytopenia. Monitoring platelet count in these high-risk groups should remain central to management. Plateletcrit was found to be a promising index. MPV and PDW did not provide clinically meaningful prognostic value. Routine integration of MPV and PDW into dengue monitoring protocols is not recommended. Keywords: Dengue, Thrombocytopenia, MPV, PDW, PCT.
Page No: 1166-1170 | Full Text
Original Research Article
PREVALENCE AND SPECTRUM OF CUTANEOUS MANIFESTATIONS AMONG THE GERIATRIC POPULATION: A CROSS-SECTIONAL STUDY
http://dx.doi.org/10.70034/ijmedph.2026.2.198
Lakkireddygari Sujana, Guneet Bedi, Vijay P Zawar
View Abstract
Background: Population ageing is increasing globally and in India, leading to a higher burden of age-related dermatological conditions. Geriatric dermatoses include both physiological skin changes and pathological disorders, which vary across populations and require region-specific evaluation. The objective is to assess the prevalence and pattern of physiological skin changes and dermatological disorders among geriatric patients attending a tertiary care hospital. Materials and Methods: A hospital-based cross-sectional study was conducted among 500 patients aged ≥60 years attending the dermatology outpatient department over a period of six months. Detailed clinical evaluation was performed, and dermatoses were categorized into physiological changes and pathological conditions. Data were analysed using SPSS software, and categorical variables were expressed as frequencies and percentages. Results: Among 500 patients, 57.0% were males and 43.0% were females (male-to-female ratio 1.32:1). Wrinkles and canities were universal findings (100%). Cherry angiomas (90%) and idiopathic guttate hypomelanosis (54%) were also common. Xerosis was observed in 38% of patients. Among pathological conditions, pruritus was the most common presenting symptom (55%). Bacterial infections (31%) and dermatophytosis (30%) were the most frequent dermatoses, followed by allergic contact dermatitis (28%). Psoriasis vulgaris (18.4%) was the most common papulosquamous disorder. Malignant conditions were observed in 3% of patients, with squamous cell carcinoma being the most common. Conclusion: Geriatric dermatoses demonstrate a broad spectrum of physiological and pathological conditions. While ageing-related skin changes are nearly universal, infections and eczematous disorders constitute the major disease burden. Early identification and appropriate management are essential to reduce morbidity and improve quality of life in the elderly population. 
Keywords: Geriatrics, Dermatoses, Xerosis, Pruritus, Infections, Dermatitis, Psoriasis, Ageing.
Page No: 1171-1179 | Full Text
Original Research Article
CROSS-SECTIONAL STUDY OF MICROALBUMINURIA AND SUBCLINICAL TARGET ORGAN DAMAGE IN ESSENTIAL HYPERTENSION
http://dx.doi.org/10.70034/ijmedph.2026.2.199
Ajmeer Pasha, Nagaraja
View Abstract
Background: Essential hypertension is a major contributor to cardiovascular morbidity and mortality, largely due to progressive target organ damage that often develops silently. Microalbuminuria has emerged as an early marker of endothelial dysfunction and vascular injury and may reflect subclinical involvement of multiple organs in hypertensive patients. The aim is to assess the prevalence of microalbuminuria and its association with subclinical target organ damage in patients with essential hypertension. Materials and Methods: A hospital-based cross-sectional study was conducted among 200 patients with essential hypertension. Clinical history, blood pressure measurements, and relevant laboratory investigations were recorded. Microalbuminuria was assessed using the urine albumin-to-creatinine ratio. Subclinical target organ damage was evaluated through electrocardiography and echocardiography for left ventricular hypertrophy, fundoscopic examination for hypertensive retinopathy, renal function assessment, and carotid intima-media thickness measurement where feasible. Statistical analysis included descriptive statistics, chi-square test, independent t-test, and odds ratio estimation with a significance level of p <0.05. Results: Microalbuminuria was present in 31.5% of hypertensive patients. Individuals with microalbuminuria were significantly older and had a longer duration of hypertension. Subclinical target organ damage was observed in more than half of the study population, with left ventricular hypertrophy being the most common abnormality. Microalbuminuria showed a significant association with left ventricular hypertrophy, hypertensive retinopathy, carotid intima-media thickening, reduced renal function, and multiple organ involvement. Conclusion: Microalbuminuria is a common finding in essential hypertension and is strongly associated with early target organ damage. 
Routine screening for microalbuminuria may aid in early detection of vascular injury and improve cardiovascular risk stratification in hypertensive patients. Keywords: Microalbuminuria, Essential hypertension, Subclinical target organ damage.
Page No: 1180-1184 | Full Text
Original Research Article
GAP ASSESSMENT OF MALNUTRITION TREATMENT UNITS OF GWALIOR DISTRICT
http://dx.doi.org/10.70034/ijmedph.2026.2.200
Jay Sharma, Ajay Kumar Gaur, Satendra Singh Rajput
View Abstract
Background: Severe Acute Malnutrition (SAM) affects 47 million children globally, with India bearing the highest burden. Despite the establishment of 1,151 Malnutrition Treatment Units (MTUs), including 318 in Madhya Pradesh, progress remains limited. NFHS-5 data show only slight improvements in child nutrition. This study evaluates the structure and functioning of MTUs in Gwalior district to identify gaps and enhance their effectiveness. Materials and Methods: A qualitative study was conducted over a period of 2 years. Four of the 30 malnutrition treatment units were selected via purposive sampling. The outcome report of each selected MTU was also evaluated. Focus group discussions were conducted with MTU staff to identify barriers to facility-based management of SAM. A checklist was used to evaluate the availability of human resources, infrastructure, and logistical capabilities of each of these units. Further, a quick assessment checklist was created to expedite the assessment of MTUs. Results: The total number of defaulters was very high (218; 50.93%) across all four MTUs. There were deficiencies in logistics, infrastructure, and human resources; 17 posts of nursing staff were vacant. Through four focus group discussions, this study identified barriers to SAM management in public healthcare, highlighting issues in the health system, hospital settings, and on the demand side. Conclusion: This gap analysis highlights the issues and challenges in the functioning of malnutrition treatment units of Gwalior district. The high prevalence of malnutrition in the region warrants the need to understand the deficiencies in the functioning of these units and to intervene for their effective functioning. Keywords: Malnutrition treatment units (MTU), severe acute malnutrition (SAM), Quick Assessment Checklist (QuAC).
Page No: 1185-1190 | Full Text
Original Research Article
DETERMINANTS OF INFANT AND YOUNG CHILD FEEDING PRACTICES AND MATERNAL KNOWLEDGE AMONG MOTHERS OF CHILDREN UNDER TWO YEARS IN RURAL KOLAR: A CROSS-SECTIONAL STUDY
http://dx.doi.org/10.70034/ijmedph.2026.2.201
Pravallika Kotamreddy, James Daniel S.
View Abstract
Background: Appropriate Infant and Young Child Feeding (IYCF) practices within the first two years of life are critical to ensuring optimal growth and development of children. The World Health Organization recommends early initiation of breastfeeding, exclusive breastfeeding for the first six months, timely introduction of complementary feeding, and continued breastfeeding until two years or beyond. In spite of these evidence-based guidelines, inappropriate feeding practices persist in India. To improve child nutrition outcomes, a holistic understanding of the behavioural determinants and maternal knowledge that affect feeding practices is necessary. This study evaluated the determinants of IYCF behaviours and assessed maternal knowledge among mothers of children aged 0-24 months in rural Kolar, Karnataka. Materials and Methods: A prospective, cross-sectional study was carried out among 107 mothers of children aged 0-24 months residing in four randomly selected rural villages of Kolar. An interviewer-administered questionnaire based on WHO IYCF indicators was used to collect data. Maternal knowledge, attitude, and feeding practices were evaluated and categorized based on predetermined scoring systems. Socio-demographic variables were summarized by descriptive statistics. The Chi-square test and multivariable logistic regression analysis were used to identify determinants of appropriate IYCF practices. Results: Of the 107 participants, 59.8% of mothers demonstrated poor knowledge of recommended feeding practices, and only 5.6% showed good knowledge. Most mothers (68.2%) had a neutral attitude towards IYCF practices. Only 34.6% of mothers practised appropriate IYCF behaviours, while 65.4% followed inappropriate feeding behaviours. 
Multivariable logistic regression analysis revealed that maternal attitude was the only significant predictor of appropriate feeding practices (AOR 2.49; 95% CI: 1.00-6.21; p = 0.049). Conclusion: The study found significant gaps in maternal knowledge and feeding practices in rural Kolar. Behavioural determinants, especially maternal attitude, were decisive in the adoption of appropriate feeding practices. Strengthening community-based counselling and behaviour-change communication initiatives can therefore help improve infant feeding behaviour and child nutrition. Keywords: Breastfeeding, Child Nutrition, Complementary Feed, Infant and Young Child Feeding, Maternal Knowledge, Maternal Practices.
Page No: 1191-1197 | Full Text
Original Research Article
PREVALENCE OF PSYCHOSOCIAL PROBLEMS AMONG SCHOOL-GOING ADOLESCENTS IN RURAL AND URBAN CHENNAI: A CROSS-SECTIONAL STUDY
http://dx.doi.org/10.70034/ijmedph.2026.2.202
K.S. Vidhyalakshmi, Hetal Tejas Mer, Ruma Dutta, Mrs. Merlin G
View Abstract
Background: Adolescence is a critical period marked by rapid emotional, physical, and social changes. Psychosocial problems during this phase can hinder overall development, especially with differing exposures in urban and rural settings. Aim: To determine and compare the prevalence of psychosocial problems among school-going adolescents in urban and rural areas of Chennai. Materials and Methods: A school-based comparative cross-sectional study was conducted among 374 adolescents aged 11–14 years, selected from two schools (one urban and one rural) in Chennai between January 2025 and April 2025. Psychosocial status was assessed using the validated Pediatric Symptom Checklist–Youth Report (Y-PSC). Data were entered and analyzed using SPSS version 28. Descriptive statistics were used to summarize the data, and inferential statistics were applied to assess associations. The Chi-square test was used for categorical variables and a p-value of <0.05 was considered statistically significant. Results: Among the 374 school-going adolescents aged 11–14 years included in the study (urban = 187, rural = 187), psychosocial problems were significantly more prevalent among urban adolescents (28.8%) compared to rural adolescents (1.6%). Urban students reported higher levels of loneliness and low self-esteem. However, on univariate logistic regression analysis, place of residence was not found to be a statistically significant predictor of psychosocial problems. Female gender (OR = 1.76; 95% CI: 1.10–2.82; p = 0.019) and third birth order (OR = 2.65; 95% CI: 1.02–6.87; p = 0.045) were identified as significant predictors, whereas other sociodemographic variables were not significantly associated. Conclusion: Psychosocial problems were more prevalent among urban adolescents at the descriptive level; however, residence was not an independent predictor after adjustment. Female gender and higher birth order were significantly associated with psychosocial problems. 
These findings emphasise the importance of early identification through school-based screening and targeted interventions to address adolescent mental health concerns effectively. Keywords: Mental Health, Prevalence, School Health Services, Screening.
Page No: 1198-1205 | Full Text
Original Research Article
A RETROSPECTIVE ANALYSIS OF THE CYTOMORPHOLOGICAL SPECTRUM OF SALIVARY GLAND LESIONS USING THE MILAN SYSTEM FOR REPORTING SALIVARY GLAND CYTOPATHOLOGY
http://dx.doi.org/10.70034/ijmedph.2026.2.203
Poornima Mishra, Marisha Srivastava, Anmoldeep
View Abstract
Background: Salivary gland lesions are a heterogeneous group ranging from inflammatory conditions to benign and malignant neoplasms. Fine-needle aspiration cytology (FNAC) plays a key role in their evaluation. The Milan System for Reporting Salivary Gland Cytopathology (MSRSGC) provides a standardized framework for diagnosis and risk stratification. The aim is to evaluate the cytomorphological spectrum of salivary gland lesions using the Milan system and assess its diagnostic utility. Materials and Methods: This retrospective study included 45 cases of salivary gland lesions evaluated by FNAC. Cases were categorized according to the MSRSGC, and histopathological correlation was performed wherever available. Risk of malignancy (ROM) and diagnostic accuracy parameters were calculated. Results: The majority of cases were benign, with pleomorphic adenoma being the most common lesion (37.8%). Category IV-a was the most frequent Milan category (37.8%). ROM increased progressively across categories, reaching 100% in Category VI. FNAC showed a sensitivity of 75%, specificity of 90%, and diagnostic accuracy of 87.5%. Conclusion: The Milan system is a reliable tool for risk stratification of salivary gland lesions. FNAC demonstrates high specificity and accuracy, supporting its role as an effective initial diagnostic modality. Keywords: FNAC, Salivary gland lesions, Milan system, Cytology, Risk of malignancy.
Page No: 1206-1211 | Full Text
Original Research Article
INTERSCALENE VERSUS REDUCED-VOLUME SUPRACLAVICULAR NERVE BLOCK IN SHOULDER SURGERY: A RANDOMIZED EVALUATION
http://dx.doi.org/10.70034/ijmedph.2026.2.204
Ekta Raja, Rashmi Chaudhari, Devanand Pawar
View Abstract
Background: Interscalene block (ISB) is considered the gold standard for analgesia in shoulder surgery but is frequently associated with hemidiaphragmatic paralysis (HdP). Supraclavicular block (SCB) may offer a diaphragm-sparing alternative. This study aimed to compare the incidence of HdP and respiratory outcomes between ultrasound-guided ISB and SCB. Materials and Methods: In this randomized study, 80 patients undergoing shoulder surgery were allocated to receive either ISB (n = 40) or SCB (n = 40) using 20 mL of 0.5% bupivacaine. Diaphragmatic excursion was assessed using M-mode ultrasonography at baseline and at 30 minutes, 3 hours, and 6 hours post-block. Oxygen saturation (SpO₂), oxygen supplementation requirements, and perioperative complications were recorded. HdP was classified as partial (25–75% reduction in excursion) or complete (≥75% reduction or paradoxical movement). Results: The incidence of total HdP was significantly higher in the ISB group compared with the SCB group (52.5% vs 10%, p < 0.0001), and partial HdP was also more frequent with ISB. Oxygen saturation was significantly lower in the ISB group at multiple postoperative time points (30 minutes to 6 hours), and oxygen supplementation was required more frequently (52.5% vs 10%, p < 0.0001). Diaphragmatic excursion at 30 minutes was significantly reduced in the ISB group (p < 0.0001). No significant differences were observed in other complications, including dyspnoea, Horner syndrome, hematoma, or paraesthesia. Conclusion: Although SCB does not completely eliminate hemidiaphragmatic dysfunction, it is associated with a significantly lower incidence and severity of HdP compared with ISB and provides better preservation of postoperative oxygenation. SCB may be a safer alternative in patients for whom preservation of respiratory function is a priority. Keywords: Interscalene block; supraclavicular block; hemidiaphragmatic paralysis; diaphragmatic excursion.
Page No: 1212-1217 | Full Text
Original Research Article
CLINICAL AND HISTOPATHOLOGICAL CORRELATION OF BREAST LUMPS WITH DIAGNOSTIC ACCURACY OF TRIPLE ASSESSMENT IN A RURAL TERTIARY CARE SETTING OF KONKAN REGION
http://dx.doi.org/10.70034/ijmedph.2026.2.205
Sanish Shringarpure, Swapnaja Shringarpure, Gaysmindar Shrikant Dadarao, Chinmay Mangesh Dake, Taide Ankit Arvind
View Abstract
Background: Breast lumps represent one of the most common clinical presentations in female patients, with causes ranging from benign fibroadenomas to malignant carcinomas. In rural healthcare settings, reliance on clinical evaluation remains high due to limited access to advanced diagnostics. The objective is to study the clinical profile, diagnostic accuracy, and histopathological correlation of breast lumps using the triple assessment approach in a rural tertiary care center. Materials and Methods: A prospective study was conducted on 100 female patients presenting with palpable breast lumps. Each case underwent clinical examination, mammography, FNAC, and subsequent histopathological evaluation. The diagnostic performance of clinical examination was compared against mammography and FNAC. Results: Among the study population, 54% had malignant lumps while 46% were benign. Clinical examination demonstrated high sensitivity (96.08% vs. mammography; 96.23% vs. FNAC) and specificity (83.67% and 87.23%, respectively). Invasive ductal carcinoma was the most common malignancy (83.35%), and fibroadenoma was the most frequent benign lesion (41.3%). The majority of malignant cases were staged IIIA or higher. Conclusion: Triple assessment remains a reliable diagnostic method for breast lumps. In resource-limited rural settings, clinical examination plays a critical role in early diagnosis and guiding management decisions. Keywords: Breast Lump, Clinical Examination, FNAC, Mammography, Histopathology.
Page No: 1218-1221 | Full Text
Original Research Article
SILENT RESISTANCE: EMERGING ANTIMICROBIAL PATTERNS IN VAGINAL INFECTIONS AT A RURAL MEDICAL COLLEGE HOSPITAL
http://dx.doi.org/10.70034/ijmedph.2026.2.206
Sangeetha Karunanithi, Jayashree Veerasamy, Sindhu Gunasekaran, Sankareswari Raja
View Abstract
Background: Abnormal vaginal discharge is a common clinical presentation among women of reproductive age, with bacterial vaginosis, vulvovaginal candidiasis, and trichomoniasis being the predominant causes. Increasing antimicrobial resistance, biofilm formation, and shifts in vaginal microbiota have reduced the effectiveness of syndromic management, particularly in resource-limited settings. Local microbiological surveillance is essential to guide empirical therapy and antimicrobial stewardship. The objective is to determine the spectrum of pathogens associated with vaginal discharge and to assess their antimicrobial susceptibility patterns in a rural medical college hospital. Materials and Methods: A prospective, hospital-based cross-sectional observational study was conducted over one year in the Departments of Obstetrics & Gynaecology and Microbiology at a rural tertiary care hospital. A total of 449 women aged 18–45 years presenting with abnormal vaginal discharge were included using convenience sampling. Vaginal swabs were collected under aseptic conditions and processed using standard microbiological techniques. Antimicrobial susceptibility testing was performed using the Kirby–Bauer disc diffusion method following CLSI guidelines. A p-value < 0.05 was considered statistically significant. Results: A total of 472 isolates were obtained from 449 participants. Normal vaginal flora constituted 47.7% of isolates, followed by Gram-negative bacteria (21.0%), Gram-positive bacteria (14.8%), and fungal growth (16.1%). The most common pathogens were methicillin-resistant coagulase-negative staphylococci (10.8%), Enterococcus (10.4%), and Candida species (16.1%). Gram-negative organisms were more prevalent in older age groups (p < 0.001). Enterobacteriaceae showed high resistance to beta-lactam antibiotics, and Gram-positive organisms exhibited high resistance to penicillin but showed relatively better susceptibility to linezolid and vancomycin. 
Multidrug-resistant organisms, including vancomycin-resistant enterococci and a pan-drug-resistant Klebsiella isolate, were also identified. Conclusion: The study reveals a high burden of pathogenic vaginal isolates with emerging antimicrobial resistance in a rural tertiary care setting. The presence of multidrug-resistant organisms highlights the limitations of syndromic management. These findings underscore the need for microbiology-guided therapy, routine local antibiogram surveillance, and strengthened antimicrobial stewardship to ensure effective management of vaginal infections. Keywords: Vaginal discharge, Antimicrobial resistance, Rural health, Antibiogram.
Page No: 1222-1227 | Full Text
Original Research Article
THE IMPACT OF LINE PROBE ASSAY ON EARLY DIAGNOSIS, TREATMENT INITIATION AND OUTCOME FOR SUSPECTED TB PATIENTS IN A TERTIARY CARE HOSPITAL
http://dx.doi.org/10.70034/ijmedph.2026.2.207
Sweta Rupala, Sangita Rajdev, Summaiya Mullan
View Abstract
Background: Rapid diagnosis and timely initiation of appropriate therapy are critical for tuberculosis (TB) control, particularly in multidrug-resistant TB (MDR-TB). Conventional culture-based drug susceptibility testing (DST), although considered the gold standard, is time-consuming and delays treatment decisions. Line Probe Assay (LPA) offers rapid molecular detection of Mycobacterium tuberculosis and associated drug resistance. The objective is to evaluate the impact of LPA on early diagnosis, treatment initiation, and outcomes in suspected TB patients. Materials and Methods: A retrospective comparative study was conducted at a tertiary care hospital. Culture-based diagnostics from May–July 2023 (n=285) were compared with LPA-based diagnostics from August–October 2023 (n=314). Data were collected from laboratory records and the Nikshay portal. Parameters analyzed included positivity rates, resistance patterns, and time from diagnosis to treatment initiation. Results: Culture detected 20 positive cases (7%), including 5 MDR-TB, whereas LPA detected 112 positive cases (35.7%), including 10 MDR-TB and additional mono-resistant cases. The mean time to treatment modification in MDR-TB cases decreased from 60 days (culture) to 36 days (LPA). LPA enabled earlier identification of resistance, reducing duration of ineffective therapy. Conclusion: LPA significantly reduces diagnostic delay and facilitates earlier initiation of appropriate therapy, especially in MDR-TB cases. Incorporation of LPA into routine diagnostic workflows can improve patient outcomes and reduce transmission. Keywords: Tuberculosis, Line Probe Assay, Multidrug-resistant TB, Drug Susceptibility Testing, Rapid Diagnosis.
Page No: 1228-1231 | Full Text
Original Research Article
CLINICAL PROFILE AND DETERMINANTS OF PEDIATRIC URINARY TRACT INFECTION: A MULTICENTRIC STUDY FROM SOUTHERN INDIA
http://dx.doi.org/10.70034/ijmedph.2026.2.208
Sudheera Sulgante, Vinod Khelge, Pooja Shivkumar
View Abstract
Background: Urinary tract infection (UTI) is one of the most common bacterial infections in children, often presenting with nonspecific symptoms and leading to significant morbidity if untreated. Understanding its clinical profile, microbial patterns, and determinants is vital for improving management and preventing recurrence, particularly in the Indian context. The aim is to evaluate the clinical profile and determinants of urinary tract infection among pediatric patients attending three private hospitals in southern India. Materials and Methods: A hospital-based, cross-sectional study was conducted over six months across three private tertiary care hospitals. A total of 150 children (1 month-14 years) diagnosed with UTI based on clinical features and urine culture were included. Demographic, clinical, and laboratory data were recorded. Urine samples were processed for culture and antibiotic sensitivity using CLSI guidelines. Comparative analyses were performed for upper vs. lower UTI, age groups, microbial susceptibility, and recurrence predictors. Statistical analysis was done using SPSS v26; p < 0.05 was considered significant. Results: Upper UTI was diagnosed in 62 (41.3%) and lower UTI in 88 (58.7%) children. Fever ≥38.5°C (88.7% vs. 44.3%), vomiting (62.9% vs. 35.2%), elevated CRP (32.4 ± 18.7 vs. 11.6 ± 10.2 mg/L), and longer hospital stay (3.7 ± 1.6 vs. 1.8 ± 1.1 days) were significantly more frequent in upper UTI (p < 0.001). Infants often had nonspecific symptoms, whereas older children predominantly had dysuria and urgency. E. coli was the predominant isolate (70.7%), followed by Klebsiella (14%) and Enterococcus (6%). Nitrofurantoin and amikacin showed the highest sensitivity (83% and 91.5%, respectively), while 38.7% of E. coli isolates were ESBL producers. 
Recurrence occurred in 24.7% of cases and was independently associated with prior UTI (aOR 3.10), constipation (aOR 2.01), dysfunctional voiding (aOR 2.72), poor hygiene (aOR 2.28), VUR (aOR 3.52), and ESBL pathogens (aOR 2.21). Completion of antibiotic therapy was protective (aOR 0.48, p = 0.046). Conclusion: Pediatric UTIs are predominantly caused by E. coli with emerging multidrug resistance. Clinical severity is greater in upper UTI, and recurrence is influenced by anatomical, behavioral, and microbial factors. Timely diagnosis, rational antibiotic use, and hygiene education are essential to reduce recurrence and long-term renal morbidity. Keywords: Pediatric urinary tract infection; Clinical determinants; Antimicrobial resistance; Vesicoureteral reflux; Southern India.
Page No: 1232-1238 | Full Text
Original Research Article
ANALYSIS OF MRI FINDINGS IN PATIENTS DIAGNOSED WITH KNEE OSTEOARTHRITIS: AN INSTITUTIONAL BASED STUDY
http://dx.doi.org/10.70034/ijmedph.2026.2.209
P Bahrath Kumar
View Abstract
Background: Osteoarthritis of the knee is a common degenerative joint disorder and an important cause of pain, stiffness, restricted mobility, and functional disability. Magnetic resonance imaging is useful in the evaluation of knee osteoarthritis because it provides detailed assessment of articular cartilage, subchondral bone, menisci, ligaments, synovium, and joint fluid. MRI can detect early and advanced structural abnormalities that may not be completely assessed on routine radiography. The aim is to analyze MRI findings in patients diagnosed with knee osteoarthritis. Materials and Methods: This hospital-based observational study was conducted in the Department of Radio Diagnosis, Malla Reddy Institute of Medical Sciences, Hyderabad, Telangana, India. A total of 74 patients with clinical features suggestive of knee osteoarthritis were included. Patients of either sex presenting with knee pain, stiffness, swelling, crepitus, restricted movement, or functional limitation underwent MRI examination of the symptomatic knee. MRI was performed using standard sequences in sagittal, coronal, and axial planes. Findings were assessed for cartilage loss, osteophytes, joint effusion, meniscal degeneration, meniscal tear, bone marrow edema, subchondral cysts, synovial thickening, ligament abnormalities, and Baker’s cyst. Data were analyzed using IBM SPSS Statistics. Frequencies, percentages, and appropriate statistical tests were applied, and a p-value of less than 0.05 was considered statistically significant. Results: The majority of patients were in the 51–60 years age group (24 patients, 32.43%), followed by 61–70 years (21 patients, 28.38%). Females (41 patients, 55.41%) were more commonly affected than males (33 patients, 44.59%). Moderate osteoarthritis was the most common MRI grade, seen in 34 patients (45.95%), followed by severe osteoarthritis in 22 patients (29.73%) and mild osteoarthritis in 18 patients (24.32%).
Cartilage thinning/loss was the most frequent MRI finding, observed in 60 patients (81.08%), followed by osteophytes in 48 patients (64.86%), joint effusion in 45 patients (60.81%), meniscal degeneration in 42 patients (56.76%), and meniscal tear in 36 patients (48.65%). Increasing age showed a statistically significant association with MRI severity of osteoarthritis (p=0.001). Bone marrow edema, joint effusion, meniscal tear, synovial thickening, and subchondral cysts were significantly associated with advanced MRI severity. Conclusion: MRI is a valuable modality for comprehensive evaluation of knee osteoarthritis. It helps in detecting cartilage loss, meniscal abnormalities, subchondral bone changes, synovial involvement, and joint effusion, and is useful for assessing disease severity. Keywords: Knee Osteoarthritis; Magnetic Resonance Imaging; Cartilage Loss; Meniscal Tear; Joint Effusion.
Page No: 1239-1244 | Full Text
Original Research Article
SUPRAPATELLAR VERSUS INFRAPATELLAR NAILING FOR TIBIAL SHAFT FRACTURES: A COMPARISON OF SURGICAL AND CLINICAL OUTCOMES BETWEEN TWO APPROACHES
http://dx.doi.org/10.70034/ijmedph.2026.2.210
Yogendra Bhaskar Nehete, Yogesh Savliram Gangurde, Mahesh Fakkad Mule
View Abstract
Background: Intramedullary nailing is the standard treatment for tibial shaft fractures. While the infrapatellar (IP) approach has been traditionally used, the suprapatellar (SP) approach has gained popularity due to potential advantages in alignment and functional outcomes. However, comparative evidence between the two approaches remains limited and heterogeneous. The objective is to compare suprapatellar and infrapatellar intramedullary nailing in terms of surgical and clinical outcomes in patients with tibial shaft fractures. Materials and Methods: A prospective, interventional, randomized study was conducted over 18 months including 40 adult patients with tibial shaft fractures. Patients were randomly allocated into two groups: SPN (n=20) and IPN (n=20). Baseline demographic and fracture characteristics were comparable between groups. Intraoperative parameters (operative time, blood loss), clinical outcomes (time to union, range of motion), and functional outcomes (Lysholm knee score and Lower Extremity Functional Scale [LEFS]) were assessed. Statistical analysis was performed using unpaired t-test, Chi-square test, and Fisher’s exact test, with p<0.05 considered significant. Results: The mean operative time was significantly lower in the SPN group compared to the IPN group (109.55±15.26 vs. 145.15±28.28 minutes; p<0.05). Mean blood loss was also significantly reduced in the SPN group (42.45±7.07 ml vs. 63.45±8.02 ml; p<0.05). Duration of hospital stay and time to clinical union were comparable between the groups (p>0.05). Functional outcomes were significantly better in the SPN group, with 100% of patients achieving excellent LEFS scores compared to 65% in the IPN group (p<0.05). The mean Lysholm knee score was higher in the SPN group (91.05±4.62 vs. 74.90±7.92), though not statistically significant. The incidence of malalignment was significantly lower in the SPN group (5% vs. 20%; p<0.05).
Conclusion: Suprapatellar intramedullary nailing offers superior intraoperative efficiency and improved functional outcomes compared to the infrapatellar approach, while maintaining comparable fracture healing. It may be considered a preferred technique for the management of tibial shaft fractures. Keywords: Tibial shaft fracture; Suprapatellar nailing; Infrapatellar nailing; Intramedullary nailing; Functional outcome; Lysholm knee score; Lower extremity functional scale; Malalignment.
Page No: 1245-1250 | Full Text
Case Series
CASE SERIES ON MUMPS MYOCARDITIS HIGHLIGHTING THE NEED OF INTRODUCTION OF MMR VACCINE TO NIS
http://dx.doi.org/10.70034/ijmedph.2026.2.211
Priyanka Patil, Abhilash Yamavaram, Deepa Phirke
View Abstract
Mumps is a vaccine-preventable disease that usually presents as parotitis and spreads by respiratory droplets. The most common complications of mumps in children are orchitis, meningitis, encephalitis, pancreatitis, and hearing loss. Cardiac involvement has been considered a rare but clinically significant complication of mumps infection. We report four cases of probable myocarditis in mumps patients (3 mumps-suspect and 1 probable mumps). This was a prospective observational study conducted during the hospital stay of the selected cases. All cases had a similar presentation, with a history of parotitis followed by myocarditis. Myocarditis was confirmed by bedside 2D echocardiography. In India, very limited data are available on the burden of mumps and mumps myocarditis. Mumps outbreaks have been reported from various states every 5 years, and the mumps vaccine is not included in the National Immunisation Schedule (NIS). This study highlights the importance of adding the mumps vaccine to the NIS and of suspecting myocarditis in patients presenting with oedema, respiratory distress, and cardiogenic shock in the setting of mumps. Early recognition of symptoms and prompt treatment improve prognosis and prevent death from this serious complication. Key Words: Mumps, Myocarditis, Cardiogenic Shock, Congestive Cardiac Failure, MMR Vaccine.
Page No: 1251-1255 | Full Text
Original Research Article
COMPARATIVE STUDY OF RESULTS OF FLEXOR TENDON REPAIR FOLLOWED BY KLEINERT TRACTION PROTOCOL WITH THE EXISTING METHOD OF FLEXOR TENDON REPAIR, IMMOBILISATION & ULTRASOUND PROTOCOL
http://dx.doi.org/10.70034/ijmedph.2026.2.212
T. Sri Krishna Kumaran, S. Sathiesh
View Abstract
Background: Injuries to the hand are commonly encountered in today’s hospital setting. The most effective method of restoring strength and excursion to repaired tendons involves the use of strong, resistant suture techniques followed by frequent application of controlled motion stress. The present study aimed to compare the results of flexor tendon repair followed by the Kleinert traction protocol with the existing method of flexor tendon repair followed by the immobilisation and ultrasound protocol. Materials and Methods: This was a single-centre, prospective, comparative study conducted in patients with flexor tendon injuries undergoing primary repair, managed postoperatively with either the Kleinert traction with early mobilisation protocol or the immobilisation protocol. After fitness for surgery was confirmed, patients were randomly allocated to either the Kleinert traction protocol or the immobilisation and ultrasound protocol. Results: The study included a total of 65 patients. A total of 149 fingers were repaired, distributed between the two protocols as follows: Kleinert traction, 74 fingers; immobilisation and ultrasound, 75 fingers. An overall good to excellent outcome was seen in 75.84% (113) of fingers, with fair outcomes in 23 fingers and poor results in 13 fingers. In the Kleinert traction group, good to excellent results were obtained in 74.32% (55) of fingers, with fair outcomes in 12 fingers and poor results in 7 fingers. In the immobilisation and ultrasound group, good to excellent results were obtained in 77.33% (58) of fingers, with fair outcomes in 11 fingers and poor results in 6 fingers. In the Kleinert traction group, of the 12 fingers with fair results, 10 had adhesions and 2 had flexion contracture; all 7 poor results were due to flexion contracture, and there was no tendon rupture in this group. In the immobilisation and ultrasound group, all 11 fair results were due to adhesions; of the 6 poor results, 5 were ruptures and 1 was an adhesion.
Conclusion: With the simple and cost-effective Kleinert rubber-band traction early mobilisation protocol, we achieved satisfactory results for repaired digital flexor tendons. Both protocols yielded results well within the accepted limits. Keywords: Kleinert’s traction, early mobilisation, digital flexor tendon injury, immobilisation and ultrasound therapy.
Page No: 1256-1260 | Full Text
Original Research Article
COMPUTED TOMOGRAPHY IN THE ASSESSMENT OF ACUTE ABDOMEN: A PROSPECTIVE ANALYSIS
http://dx.doi.org/10.70034/ijmedph.2026.2.213
Sushen Kumar Kondapavuluri
View Abstract
Background: Acute abdominal pain is one of the most common presentations in emergency departments and poses a significant diagnostic challenge due to its wide spectrum of underlying causes. Rapid and accurate diagnosis is essential to guide appropriate management and reduce morbidity and mortality. Computed tomography (CT) has emerged as a key imaging modality in the evaluation of acute abdomen, offering high diagnostic accuracy and influencing clinical decision-making. Materials and Methods: This prospective observational study was conducted in a tertiary care emergency department. A total of 642 adult patients presenting with acute abdominal pain were included. All patients underwent standardized clinical evaluation followed by CT imaging. Diagnostic performance metrics including sensitivity, specificity, positive predictive value (PPV), negative predictive value (NPV), and accuracy were calculated. The impact of CT on management decisions and clinical outcomes was analyzed. Results: CT demonstrated high diagnostic accuracy across all major conditions, with sensitivity ranging from 87.3% to 96.9% and specificity from 94.6% to 98.2%. Emergency surgical interventions decreased from 45.0% to 33.6%, while conservative management increased from 30.4% to 50.8%. Mean hospital stay was reduced from 4.5 ± 3.4 days to 3.3 ± 2.6 days, and time to diagnosis decreased from 5.3 ± 3.0 hours to 2.7 ± 1.3 hours (p<0.001). Complication rates and unnecessary surgeries were significantly lower in the CT-guided group, with improved patient satisfaction and reduced healthcare costs. Conclusion: CT is a highly reliable and clinically impactful imaging modality in the evaluation of acute abdominal conditions. Its use significantly improves diagnostic accuracy, optimizes management strategies, reduces unnecessary interventions, and enhances patient outcomes. Keywords: Acute abdomen; Computed tomography; Diagnostic accuracy; Emergency imaging; Clinical decision-making; Multidetector CT.
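The sensitivity, specificity, PPV, NPV, and accuracy figures reported in abstracts such as this one all derive from the standard 2×2 confusion matrix. As an illustrative sketch only (the counts below are hypothetical and not taken from the study), the definitions can be written as:

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Standard diagnostic performance metrics from a 2x2 confusion matrix.

    tp/fp/fn/tn = true positives, false positives, false negatives, true negatives.
    """
    sensitivity = tp / (tp + fn)                # true-positive rate
    specificity = tn / (tn + fp)                # true-negative rate
    ppv = tp / (tp + fp)                        # positive predictive value
    npv = tn / (tn + fn)                        # negative predictive value
    accuracy = (tp + tn) / (tp + fp + fn + tn)  # overall agreement
    return sensitivity, specificity, ppv, npv, accuracy

# Hypothetical example cohort: 90 TP, 5 FP, 10 FN, 95 TN.
sens, spec, ppv, npv, acc = diagnostic_metrics(90, 5, 10, 95)
print(f"Sensitivity {sens:.1%}, Specificity {spec:.1%}, Accuracy {acc:.1%}")
```

Note that PPV and NPV, unlike sensitivity and specificity, depend on disease prevalence in the studied cohort.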
Page No: 1261-1266 | Full Text
Original Research Article
ROTATOR CUFF TEARS: ULTRASOUND ASSESSMENT WITH ARTHROSCOPIC VALIDATION
http://dx.doi.org/10.70034/ijmedph.2026.2.214
Sushen Kumar Kondapavuluri
View Abstract
Background: Rotator cuff tears are a leading cause of shoulder pain and functional impairment, particularly in middle-aged and elderly populations. Accurate imaging is essential for diagnosis, classification, and management planning. While magnetic resonance imaging (MRI) is often considered a reference standard, high-resolution musculoskeletal ultrasound has emerged as a practical first-line imaging modality. Arthroscopy remains the definitive reference standard, particularly in surgically indicated cases. Aim: To evaluate the role of high-resolution musculoskeletal ultrasound in diagnosing rotator cuff tears in patients presenting with unilateral shoulder pain, with arthroscopic validation in a clinically indicated subgroup. Materials and Methods: This prospective observational study was conducted in the Department of Radiodiagnosis at a tertiary care teaching hospital over a six-month period. A total of 120 patients with unilateral shoulder pain clinically suspected to have rotator cuff pathology underwent standardized high-resolution ultrasound examination using a 7.5–12 MHz linear transducer. Ultrasound findings were categorized as normal, partial-thickness tear, or full-thickness tear. Arthroscopy was performed in 68 patients based on clinical severity and surgical indication and served as the reference standard for validation. Descriptive statistics were used, and diagnostic performance of ultrasound was calculated in the arthroscopy subgroup. Results: The mean age of the study population was 54.6 ± 11.8 years, with male predominance (58.3%). Ultrasound detected rotator cuff tears in 77 patients (64.2%), including 34 partial-thickness (28.3%) and 43 full-thickness tears (35.8%). Arthroscopy confirmed rotator cuff tears in 62 of 68 patients (91.2%), comprising 38 partial-thickness and 24 full-thickness tears. Ultrasound demonstrated a sensitivity of 94.7% for partial-thickness tears and 91.7% for full-thickness tears, with 100% specificity for both. 
The supraspinatus tendon was the most commonly involved, and associated findings included subacromial–subdeltoid bursitis (25.8%) and biceps tendon pathology (18.3%). Conclusion: High-resolution musculoskeletal ultrasound is a reliable and effective imaging modality for evaluating rotator cuff pathology. When validated against arthroscopy, ultrasound demonstrates excellent diagnostic accuracy for both partial- and full-thickness rotator cuff tears. Given its accessibility, cost-effectiveness, and ability to identify associated shoulder abnormalities, ultrasound serves as an excellent first-line imaging tool in patients with shoulder pain, with MRI reserved for complex cases and pre-operative planning. Keywords: Rotator cuff tear; Shoulder pain; Musculoskeletal ultrasound; Arthroscopy; Supraspinatus tendon; Diagnostic imaging.
Page No: 1267-1273 | Full Text
Original Research Article
COMPARATIVE EVALUATION OF PROPOFOL AND ETOMIDATE ON HEMODYNAMIC STABILITY DURING INDUCTION FOR ELECTIVE CORONARY ARTERY BYPASS GRAFT SURGERY
http://dx.doi.org/10.70034/ijmedph.2026.2.215
Mohammed Ziauddin, Afifa Khan, Vallabdas Priyadarshini, Venumadhavi Nalluri, Harshitha Sannidhanam, Mohd Faizan Ali, G. Anantha Lakshmi
View Abstract
Background: Patients undergoing coronary artery bypass graft (CABG) surgery are highly susceptible to hemodynamic instability during induction of anesthesia and endotracheal intubation. The choice of induction agent plays a crucial role in minimizing cardiovascular fluctuations and ensuring perioperative safety. This study was designed to compare the effects of propofol and etomidate on hemodynamic parameters and their ability to attenuate the pressor response during elective CABG surgery. Materials and Methods: This prospective observational study included 100 patients aged 45–65 years scheduled for elective CABG. Participants were equally divided into two groups: Group A received propofol (2 mg/kg) and Group B received etomidate (0.3 mg/kg) for induction. Hemodynamic parameters, namely heart rate (HR), systolic blood pressure (SBP), diastolic blood pressure (DBP), and mean arterial pressure (MAP), were recorded at baseline, post-induction, during intubation, and at 3 and 5 minutes post-intubation. Data were analysed using appropriate statistical methods. Results: Propofol caused significant reductions in HR, SBP, DBP, and MAP following induction, along with marked fluctuations during intubation. Etomidate demonstrated greater hemodynamic stability with minimal variations across all time points. The pressor response to laryngoscopy and intubation was significantly attenuated in the etomidate group compared to the propofol group. Conclusion: Etomidate provides superior hemodynamic stability and better attenuation of pressor response compared to propofol in patients undergoing CABG, making it a preferable induction agent in high-risk cardiac patients. Keywords: Coronary artery bypass grafting, Propofol, Etomidate, Hemodynamic stability, Pressor response, Cardiac anesthesia.
Page No: 1274-1278 | Full Text
Original Research Article
COMPARISON OF C-MAC D BLADE WITH CONVENTIONAL BLADE FOR ENDOTRACHEAL INTUBATION IN PATIENTS UNDERGOING ELECTIVE SURGERY UNDER GENERAL ANAESTHESIA: A RANDOMIZED CONTROLLED TRIAL
http://dx.doi.org/10.70034/ijmedph.2026.2.216
Deepak Vijaykumar Kadlimatti, Amith Kumar S K, MD Tariq Siddiqui, Santosh Kumar, Bhadri Narayan, Jyothi Perumal
View Abstract
Background: Laryngoscopy and endotracheal intubation are associated with haemodynamic responses that may be detrimental in susceptible patients. Videolaryngoscopes improve glottic visualization and facilitate intubation. The C-MAC videolaryngoscope is available with a conventional Macintosh-type blade (C blade) and an angulated D blade designed for difficult airways. This study aimed to compare these blades in elective surgical patients. Materials and Methods: In this randomized, prospective, comparative study, 70 ASA physical status I–II patients aged 18–80 years undergoing elective surgeries under general anaesthesia were randomized into two groups: Group C (C-MAC conventional blade, n=35) and Group D (C-MAC D blade, n=35). Intubation time, Cormack–Lehane (CL) grade, and adverse events were recorded. Statistical analysis was performed using SPSS 26.0. Results: Mean intubation time was comparable between Group C (18.27 ± 4.53 s) and Group D (19.57 ± 4.93 s; p=0.25). A statistically significant difference was observed in CL grade distribution (χ²=35.95, p=0.0001), with a higher proportion of Grade 1 views in Group C. The incidence of adverse events was comparable between groups. Conclusion: Both blades provide similar intubation times and safety in elective surgical patients. However, the C-MAC conventional blade offers superior glottic visualization and may be preferred for routine airway management. Keywords: Airway management; C-MAC videolaryngoscope; D blade; Endotracheal intubation.
Page No: 1279-1282 | Full Text
Original Research Article
NEUTROPHIL–LYMPHOCYTE COUNT RATIO AS A MARKER OF CULTURE POSITIVITY IN SEPSIS: A CROSS-SECTIONAL STUDY
http://dx.doi.org/10.70034/ijmedph.2026.2.217
Samit Parua, Aroma Raghav, Bankar Prashant Lalaso
View Abstract
Background: Sepsis is a life-threatening condition characterized by a dysregulated host response to infection, and microbiological cultures remain the gold standard for etiological confirmation. However, a substantial proportion of sepsis cases are culture-negative, limiting early pathogen-directed management. The neutrophil–lymphocyte count ratio (NLCR), derived from routine complete blood counts, has emerged as a potential marker of systemic inflammation. This study aimed to evaluate the role of NLCR in differentiating culture-positive from culture-negative sepsis. Materials and Methods: This single-centre cross-sectional observational study included 147 adult patients diagnosed with sepsis based on Sepsis-3 criteria. Demographic, clinical, laboratory, and microbiological data were collected at admission. NLCR was calculated using absolute neutrophil and lymphocyte counts. Patients were classified as culture-positive or culture-negative sepsis based on microbiological results. Comparisons between groups were performed using appropriate statistical tests, and receiver operating characteristic (ROC) curve analysis was used to assess the diagnostic performance of NLCR. Results: Of the 147 patients, 62 (42.2%) had culture-positive sepsis and 85 (57.8%) had culture-negative sepsis. Culture-positive patients had significantly higher total leukocyte count, absolute neutrophil count, C-reactive protein levels, and significantly lower absolute lymphocyte count. Mean NLCR was significantly higher in culture-positive sepsis compared to culture-negative sepsis (16.4 ± 6.9 vs 8.9 ± 4.8; p < 0.001). An NLCR cut-off of ≥10 showed a sensitivity of 79.0% and specificity of 67.1% for predicting culture positivity, while a cut-off of ≥15 demonstrated high specificity (89.4%). ROC analysis yielded an area under the curve of 0.82 (95% CI: 0.75–0.89), indicating good discriminatory ability. 
Conclusion: NLCR is a simple, inexpensive, and readily available biomarker that is significantly associated with culture-positive sepsis and demonstrates good diagnostic accuracy. It may serve as a useful adjunct in the early identification and risk stratification of sepsis, particularly in settings with limited resources. Keywords: Sepsis; Neutrophil-lymphocyte count ratio; Culture-positive sepsis; Inflammatory biomarkers; Blood culture.
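The NLCR itself is simply the ratio of the absolute neutrophil count to the absolute lymphocyte count, which is why it is available from any routine complete blood count. A minimal sketch of the calculation and of applying the study's reported ≥10 cut-off (the patient values below are hypothetical):

```python
def nlcr(neutrophils, lymphocytes):
    """Neutrophil-lymphocyte count ratio from absolute counts (cells/uL)."""
    return neutrophils / lymphocytes

def flag_culture_positive(ratio, cutoff=10.0):
    """True if the ratio meets the cut-off associated with culture positivity."""
    return ratio >= cutoff

# Hypothetical patient: ANC 12,000/uL, ALC 800/uL.
ratio = nlcr(12000, 800)
print(round(ratio, 1), flag_culture_positive(ratio))
```

A higher cut-off (the abstract reports ≥15) trades sensitivity for specificity, flagging fewer patients but with greater confidence.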
Page No: 1283-1289 | Full Text
Original Research Article
COMPARATIVE EVALUATION OF CT-BASED PREOPERATIVE STAGING WITH AND WITHOUT DIAGNOSTIC LAPAROSCOPY IN GASTROINTESTINAL MALIGNANCIES
http://dx.doi.org/10.70034/ijmedph.2026.2.218
Angun Tayeng, Ojing Komut, Binita Singha
View Abstract
Background: Accurate preoperative staging of gastrointestinal (GI) malignancies is crucial for optimal management. Although contrast-enhanced computed tomography (CECT) is widely used, it has limitations in detecting occult metastases. Diagnostic laparoscopy (DL) may improve staging accuracy. The objective is to compare CT-based staging of GI malignancies with and without diagnostic laparoscopy and assess its impact on detection of metastasis and resectability. Materials and Methods: This prospective comparative study included 40 patients with GI malignancies. All underwent preoperative CT staging. Twenty patients underwent diagnostic laparoscopy (DL group), while 20 were evaluated without DL (non-DL group). Findings regarding metastasis, staging, and resectability were compared. Statistical analysis was performed using Z-test, with p < 0.05 considered significant. Results: CT scan showed lymph node involvement in 45% of patients, with no distant metastasis detected, and all cases were deemed resectable. Most patients were classified as Stage II (72.5%) (p < 0.0001). In the DL group, occult metastases were identified, including liver metastasis (35%), peritoneal metastasis (15%), and serosal involvement (35%). Resectability decreased to 65%, and 35% of patients were upstaged to Stage IV (p < 0.0001). In the non-DL group, higher rates of advanced disease were observed intraoperatively, with liver metastasis (55%), peritoneal metastasis (20%), and serosal involvement (55%). Resectability was 45%, and 55% of patients had Stage IV disease. Conclusion: Diagnostic laparoscopy enhances detection of occult metastasis and improves staging accuracy compared to CT alone. It aids in avoiding unnecessary laparotomies and optimizes surgical decision-making in GI malignancies. Keywords: Gastrointestinal malignancy, CT scan, diagnostic laparoscopy, staging, metastasis.
Page No: 1290-1296 | Full Text
Original Research Article
INTERNET ADDICTION AND ITS RELATIONSHIP WITH INSOMNIA AND OBSESSIVE COMPULSIVE DISORDERS AMONG YOUNG POPULATION IN URBAN AREA OF PRAYAGRAJ DISTRICT-A COMMUNITY BASED STUDY
http://dx.doi.org/10.70034/ijmedph.2026.2.219
Naresh Kumar Gangwar, M.A. Hassan, Anurag Varma
View Abstract
Background: The increasing digital engagement among youth has raised concerns about internet addiction and its potential impact on health outcomes such as sleep quality and obsessive-compulsive disorder (OCD). Understanding these associations is crucial for promoting healthy behavioral patterns among young urban populations. Objectives: 1. To assess internet addiction and its association with insomnia. 2. To examine the relationship between internet addiction and obsessive-compulsive disorders. Materials and Methods: A community-based cross-sectional study was conducted among 258 participants aged 18–24 years. Internet addiction was measured using the Internet Addiction Test (IAT), sleep quality using the Pittsburgh Sleep Quality Index (PSQI), and OCD symptoms using the Yale–Brown Obsessive Compulsive Scale (Y-BOCS). Results: The prevalence of mild internet addiction was highest (46.51%), followed by normal users (25.97%), moderate (24.81%), and severe addiction (2.71%). A highly significant association was observed between internet addiction severity and sleep quality (χ² = 114.710, p < 0.001), with higher addiction levels linked to poorer sleep, increased sleep latency, reduced sleep duration, and greater daytime dysfunction. In contrast, the relationship between internet addiction severity and OCD was not statistically significant (χ² = 6.455, p = 0.6157). Conclusion: Internet addiction is prevalent among young adults, predominantly at mild to moderate levels, and is strongly associated with impaired sleep quality. However, it does not show a significant relationship with OCD severity. These findings highlight the importance of early identification and intervention strategies to promote balanced internet use and improve sleep health among youth. Keywords: Internet Addiction, Sleep Quality, Insomnia, Obsessive-Compulsive Disorder, Young Adults, IAT, PSQI, Y-BOCS.
Page No: 1297-1303 | Full Text
Original Research Article
ASSESSMENT OF RIGHT VENTRICULAR LONGITUDINAL STRAIN AND LEFT ATRIUM STRAIN IN PATIENTS WITH ISOLATED SEVERE RHEUMATIC MITRAL STENOSIS BEFORE AND AFTER BALLOON MITRAL VALVOTOMY
http://dx.doi.org/10.70034/ijmedph.2026.2.220
Manish Khandelwal, Brajesh Kumar, Chandrabhan Meena
View Abstract
Background: Rheumatic mitral stenosis (MS) continues to be a major cause of valvular heart disease in developing countries. Chronic left atrial (LA) pressure overload leads to progressive atrial fibrosis, remodeling, and right ventricular (RV) dysfunction. Myocardial strain imaging offers a sensitive tool for detecting these functional impairments and evaluating improvement after balloon mitral valvotomy (BMV). Objective: To assess the difference in right ventricular longitudinal strain and left atrial global strain at baseline and after BMV in patients with rheumatic severe mitral stenosis, and to compare these parameters along with conventional echocardiographic indices between cases and healthy controls. Materials and Methods: This observational analytic study was conducted in the Department of Cardiology, Sawai Man Singh Medical College, Jaipur. A total of 44 patients with isolated rheumatic severe MS undergoing BMV and 44 age-matched healthy controls were enrolled. Standard echocardiographic parameters and strain imaging were obtained before and 24–48 hours after BMV. Statistical analysis was performed using Student’s t-test and chi-square test, with p ≤ 0.05 considered significant. Results: The mean mitral valve area increased significantly from 0.87±0.20 cm² to 1.52±0.17 cm² (p<0.001), while the transmitral gradient decreased from 13.20±1.68 mmHg to 4.50±0.85 mmHg (p<0.001). Both LA strain and RV strain improved significantly after BMV. Pulmonary artery systolic pressure decreased markedly, while TAPSE and RVS′ showed modest changes. Compared with controls, cases had reduced strain values and larger LA dimensions. Conclusion: BMV leads to significant improvement in LA and RV strain and reduces pulmonary pressures in severe rheumatic MS, highlighting the role of strain imaging in peri-procedural evaluation and therapeutic monitoring. Keywords: Balloon Mitral Valvotomy, Mitral Stenosis, Strain Imaging, Valvular Heart Disease
Page No: 1304-1309
Original Research Article
AWARENESS OF OBSTETRIC DANGER SIGNS AMONG ANTENATAL WOMEN IN RURAL SOUTH INDIA: A CROSS-SECTIONAL STUDY OF 300 PATIENTS IN A FREE TERTIARY CARE SETTING
http://dx.doi.org/10.70034/ijmedph.2026.2.221
Namita Bali, Padmasri R
Background: Maternal mortality continues to be a major public health concern in low-resource settings, with delays in recognizing obstetric danger signs contributing significantly to adverse outcomes (1,5). Objective: To assess the level of knowledge regarding danger signs during pregnancy and to identify factors influencing awareness among antenatal women. Materials and Methods: A cross-sectional study was conducted among 300 antenatal women attending the ANC clinic at Sri Madhusudan Sai Institute of Medical Sciences and Research, Muddenahalli, Karnataka, India. Data were collected using a structured questionnaire. Knowledge scores were categorized into good, average, and poor. Statistical analysis included descriptive statistics and chi-square testing. Results: Overall, 36% of participants had good knowledge, 41% average knowledge, and 23% poor knowledge. Vaginal bleeding (68%) and abdominal pain (61%) were the most recognized danger signs, whereas convulsions (34%) and blurred vision (39%) were the least recognized. Education and parity were significantly associated with knowledge levels (p < 0.05). Conclusion: Awareness of danger signs remains suboptimal despite access to free tertiary care. Strengthening structured antenatal education is essential to improve timely care-seeking and maternal outcomes. Keywords: Obstetric Danger Signs, Antenatal Women.
Page No: 1310-1312
Original Research Article
A COMPARATIVE STUDY OF HYPERBARIC ROPIVACAINE, ROPIVACAINE WITH DEXMEDETOMIDINE, AND ROPIVACAINE WITH FENTANYL IN SPINAL ANAESTHESIA FOR ADULT PATIENTS UNDERGOING INFRAUMBILICAL SURGERIES: A STUDY OF 90 CASES
http://dx.doi.org/10.70034/ijmedph.2026.2.222
Pooja Parecha, Deepa Gondalia, Vandana S Parmar, Ruchika Kumari, Kaushikkumar Marakana, Harshit Sarvaiya
Background: Spinal anaesthesia using ropivacaine is widely employed for infraumbilical surgeries, and the use of intrathecal adjuvants such as dexmedetomidine and fentanyl may improve block characteristics and postoperative analgesia. Objective: To compare the intraoperative and postoperative effects of intrathecal hyperbaric ropivacaine alone and in combination with either dexmedetomidine or fentanyl. Materials and Methods: A prospective randomized study was conducted on 90 patients divided into three groups of 30 each. Group RR received ropivacaine alone, Group RF received ropivacaine with fentanyl, and Group RD received ropivacaine with dexmedetomidine. Parameters assessed included onset and duration of sensory and motor block, duration of analgesia, hemodynamic changes, and adverse effects. Results: The RD group showed the fastest onset of sensory block (2.50 ± 0.85 min) and motor block (4.60 ± 0.86 min), and the longest duration of sensory block (294.00 ± 40.22 min), motor block (275.67 ± 42.97 min), and analgesia (313.00 ± 42.28 min) compared with the RF and RR groups (p < 0.001). Hemodynamic parameters remained stable and adverse effects were minimal across all groups. Conclusion: Dexmedetomidine is a superior intrathecal adjuvant to fentanyl when combined with hyperbaric ropivacaine, providing faster onset, prolonged block duration, and extended postoperative analgesia with stable hemodynamics. Keywords: Ropivacaine, Dexmedetomidine, Fentanyl, Spinal Anaesthesia.
Page No: 1313-1317
Original Research Article
CORRELATION OF CORRECTED QT INTERVAL WITH CARDIAC AUTONOMIC NEUROPATHY IN TYPE 2 DIABETES MELLITUS: A CROSS-SECTIONAL STUDY
http://dx.doi.org/10.70034/ijmedph.2026.2.223
G. Sudhakar
Background: Prolonged corrected QT (QTc) interval is an early marker of cardiac autonomic neuropathy (CAN), predisposing patients to arrhythmias, silent myocardial infarction, and sudden cardiac death. Early identification is crucial for preventing complications. Materials and Methods: A cross-sectional study was conducted among 100 patients with type 2 diabetes mellitus. Fasting blood sugar (FBS), postprandial blood sugar (PPBS), and HbA1c were measured. Resting electrocardiography was performed, and QTc was calculated using Bazett’s formula. Cardiac autonomic function was assessed using Ewing’s battery of tests. Results: The mean age was 50.54 ± 6.50 years with a slight female predominance (52%). QTc prolongation was observed in 66% of patients. Mean QTc was significantly higher in patients with HbA1c >6.5% (451.31 ± 33.87 ms) compared to those with HbA1c <6.5% (377.50 ± 37.72 ms). QTc showed a significant positive correlation with HbA1c (r = 0.39, p = 0.004) and duration of diabetes (r = 0.34, p = 0.001). Prevalence of definite CAN increased significantly with longer duration of diabetes (p < 0.001). Conclusion: QTc prolongation is significantly associated with cardiac autonomic neuropathy in T2DM patients. QTc measurement can serve as a simple, non-invasive screening tool for early detection of CAN. Keywords: Type 2 Diabetes Mellitus; QTc Interval; Cardiac Autonomic Neuropathy; Electrocardiography; HbA1c.
Page No: 1318-1323
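The abstract above reports QTc values derived with Bazett's formula, which corrects the measured QT interval for heart rate as QTc = QT / √RR. A minimal sketch of that correction is shown below; the function and variable names are illustrative, not from the study itself.

```python
import math

def qtc_bazett(qt_ms: float, rr_s: float) -> float:
    """Rate-corrected QT interval (ms) using Bazett's formula: QTc = QT / sqrt(RR).

    qt_ms: measured QT interval in milliseconds.
    rr_s:  preceding RR interval in seconds (60 / heart rate in bpm).
    """
    if rr_s <= 0:
        raise ValueError("RR interval must be positive")
    return qt_ms / math.sqrt(rr_s)

# Example: a QT of 400 ms at 75 bpm (RR = 60/75 = 0.8 s) corrects upward,
# since Bazett's formula inflates QTc at heart rates above 60 bpm.
qtc = qtc_bazett(400.0, 0.8)  # ≈ 447 ms
```

At an RR of exactly 1 s (60 bpm) the correction is neutral and QTc equals the measured QT, which is why Bazett's formula is conventionally anchored to a 60 bpm reference.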