Background: Substance use is prevalent among people with mental health issues, and patients with psychosis are more likely to use and misuse substances than the general population. Despite extensive research on substance abuse among the general public in Kenya, there is a scarcity of data comparing substance use among people with and without psychosis. This study investigates the association between psychosis and the use of various substances in Kenya. Methods: This study utilized data from the NeuroGAP-Psychosis Case-Control Study collected between April 2018 and December 2022. The KEMRI-Wellcome Trust Research Programme recruited participants from various sites in Kenya, including Kilifi County, Malindi Sub-County, Port Reitz and Coast General Provincial Hospitals, and Moi Teaching and Referral Hospital, as well as affiliated sites in Webuye, Kapenguria, Kitale, Kapsabet, Iten, and Kakamega. The collected data included sociodemographic information, substance use, and clinical diagnosis. We used frequencies (percentages) and medians (interquartile ranges) to summarize the categorical and continuous data, respectively. We examined the association between categorical variables and psychosis using the chi-square test. Logistic regression models were used to assess the factors associated with the odds of substance use, adjusting for all relevant sociodemographic variables. Results: We assessed a total of 4,415 cases and 3,940 controls. Except for alcohol consumption (p=0.41), all forms of substance use showed statistically significant differences between the case and control groups. Cases had 16% higher odds of using any substance than controls (aOR: 1.16, 95% CI: 1.05-1.28, p=0.005). Moreover, males were 3.95 times more likely to use any substance than females (aOR: 3.95; 95% CI: 3.43-4.56).
All the categories of living arrangements were protective against substance use. Conclusion: The findings of this study suggest that psychotic illnesses are associated with an increased likelihood of using various substances. These findings are consistent with those of previous studies; however, it is crucial to further investigate the potential for reverse causality between psychosis and substance abuse using genetically informed methods.
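The adjusted odds ratios above come from logistic regression with sociodemographic covariates; as a simplified illustration of what an odds ratio captures, a crude (unadjusted) OR can be computed directly from a 2×2 table of substance use by case status. A minimal sketch, using hypothetical counts rather than the study's data:

```python
def crude_odds_ratio(cases_exposed, cases_unexposed,
                     controls_exposed, controls_unexposed):
    """Crude odds ratio from a 2x2 table: (a/b) / (c/d).
    The study's adjusted ORs instead come from a logistic regression
    that controls for sociodemographic covariates."""
    return (cases_exposed / cases_unexposed) / (controls_exposed / controls_unexposed)

# Hypothetical counts: substance use among cases vs controls.
crude_odds_ratio(cases_exposed=300, cases_unexposed=700,
                 controls_exposed=200, controls_unexposed=800)  # 12/7 ≈ 1.71
```

An OR above 1, as here, indicates higher odds of exposure among cases than controls.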
Publications
Polymyxin B is a reserve antibiotic, but there has been an upsurge in its use due to a rise in multidrug-resistant gram-negative bacteria. However, nephrotoxicity and resistance concerns persist, with global resistance rates reaching 29%. The effectiveness of Polymyxin B (clinical and microbiological response) and the frequency of nephrotoxicity are not well documented in resource-limited settings. Research is essential to guide the optimization of Polymyxin B therapy and to inform the adoption of measures for early detection of kidney injury to prevent damage. This study aimed to assess the effectiveness of Polymyxin B by evaluating clinical and microbiological responses and determining nephrotoxicity incidence using KDIGO criteria in ICU patients at Moi Teaching and Referral Hospital (MTRH). A prospective observational cohort study was conducted in the MTRH ICUs between December 2021 and November 2022 on patients treated with Polymyxin B. Data on demographics, comorbidities, Polymyxin B dosage regimens, clinical responses, and microbiological results were collected. Descriptive statistics summarized patient characteristics, while associations between dosage regimens and outcomes were evaluated using Fisher's exact test and multivariate regression, with p<0.05 considered statistically significant. Forty-four patients with a mean age of 48 years were included; 66% were male, and cerebrovascular disease was the most common comorbidity. All patients had multidrug-resistant gram-negative infections qualifying for Polymyxin B therapy. Most (89%) received monotherapy, with 86% achieving a good clinical response, 7% experiencing treatment failure, and 7% dying. Doses of 20,000–25,000 IU/kg/day were associated with microbiological eradication and a good clinical response (p<0.001), while 15,000 IU/kg/day was associated with treatment failure. Acute kidney injury occurred in 48% of patients, and 68% developed hypomagnesemia.
Polymyxin B at doses of 20,000–25,000 IU/kg/day should be considered as a starting dose, given the association with a good clinical response, with alternate-day monitoring of serum creatinine levels for early detection of nephrotoxicity.
Background: Hepatocellular carcinoma (HCC) is the most common type of primary liver cancer globally, accounting for 75-85% of all liver cancer cases, with a prevalence 16 to 32 times higher in developing countries than in developed countries. HCC is a largely asymptomatic malignancy with a particularly grave prognosis in black Africans. In Kenya, HCC is the 13th most prevalent cancer and the 9th leading cause of cancer mortality. The cancer burden is greatest in areas where viral hepatitis B or C prevalence is 8% or more. Surveillance should be tailored to those at risk, as it leads to early detection of lesions while they are still amenable to surgery. We describe the clinical profile of patients with chronic liver disease and HCC in a western Kenya multidisciplinary collaborative project. Methods: The HepWek project was carried out in Western Kenya to study risk factors and to improve diagnostic capability and surgical management of HCC. This was a prospective study of patients known to have chronic liver disease or uncharacterized liver lesions requiring further workup. Chronic liver disease was defined either by follow-up in the liver clinic or the presence of abnormal liver function for more than 6 months. Results: A total of 200 patients were included in the final analysis. The majority were male (110/200, 55%), and the mean age was 47 (SD 16) years. The commonest etiology of liver disease was hepatitis B, with 90/200 (45%) cases; 30% of the patients (59/200) were diagnosed with HCC. Risk factors for HCC were older age (odds ratio [OR] 1.03, 95% CI 1.01 to 1.06), hepatitis B (OR 2.66, 95% CI 1.28 to 5.51), and liver cirrhosis (OR 2.41, 95% CI 1.21 to 4.81). The median alpha-fetoprotein (AFP) level was 111 (range 1 to 1000) ng/ml in HCC patients, and the majority (78%) had an albumin-bilirubin score of 3. Most patients (88%) presented with advanced stages of HCC; only 2 patients were amenable to hepatic resection, which they underwent successfully.
Thirty-five (74%) out of the 59 patients with HCC died within 1 year. Conclusion: Hepatocellular carcinoma patients in western Kenya present with advanced-stage disease that is unresectable. Hepatitis B virus infection and cirrhosis from all causes remain the most common risk factors.
Background: Dexmedetomidine is the preferred drug for light sedation in intensive care units (ICUs), where sedation plays an important role in patient comfort. Its advantages include a shorter weaning time and earlier extubation from mechanical ventilation without respiratory depression. However, dexmedetomidine has been associated with an over 50% incidence of hemodynamic adverse effects (hypotension and bradycardia), which have led to poor clinical outcomes. This limits its widespread use. Factors such as age, comorbidities, concomitant medications, dosage, baseline mean arterial pressure (MAP), and heart rate (HR) have been associated with adverse effects. Despite the high number of reported adverse effects, there is limited data on their incidence and associated factors in resource-limited settings. Therefore, the burden at Moi Teaching and Referral Hospital (MTRH) remains unknown. Knowledge of the incidence and associated factors may inform future practice on the safe use of dexmedetomidine at MTRH. Objective: To determine the incidence of, and clinical factors associated with, dexmedetomidine-induced adverse effects among patients sedated with dexmedetomidine in the MTRH ICU. Methods: This was a prospective observational study conducted in the MTRH ICU. Hemodynamically stable eligible participants on dexmedetomidine were enrolled through a census method between mid-April and mid-October 2022. The dependent variable was the incidence of dexmedetomidine-induced hemodynamic adverse effects. Data on MAP and HR were collected at selected time points within 24 hours. Independent variables, which included factors such as age, gender, comorbidities, concomitant medication, renal and liver function, and drug dosages, were obtained from patient records. The cutoff for hypotension was a MAP of less than 60 mmHg or a drop in MAP of 30% within the first hour, while bradycardia was an HR of less than 60 bpm or a drop in HR of 30% within the first hour of drug administration.
Continuous data were summarized using means and medians, and categorical data as frequencies and proportions. Fisher's exact test and the Kruskal-Wallis test were used to assess associations involving categorical and continuous independent variables, respectively. The association between the clinical factors and the development of dexmedetomidine-induced hemodynamic adverse effects was analyzed using a logistic regression model. Results: A total of 61 participants were recruited, and 41% had traumatic brain injury. The mean age was 37 years, and 63.9% were male. All participants had a baseline HR >60 bpm and MAP >60 mmHg at drug initiation. Five patients (8.2%) developed hypotension and one (1.6%) developed bradycardia within the first hour. The mean baseline MAP was 90.49 mmHg, with a mean decline of 5.16 mmHg at 1 hour. The mean baseline HR was 98.75 bpm, with a mean decline of 0.91 bpm within the first hour. The majority of patients received drug doses ranging from 0.2 to 0.7 mcg/kg/hr for less than 24 hours. A lower baseline MAP <70 mmHg was significantly associated with dexmedetomidine hemodynamic adverse effects (OR 2.17 [95% CI 1.08-2.97], p<0.01). However, there was no significant association between the occurrence of hemodynamic adverse effects and gender (p=0.21), baseline HR (p=0.88), comorbidities (p=0.19), concomitant medications (p=0.15), dose and duration (p=1), renal function (p=0.28), or liver function (p=0.17). Conclusion: This study reported a low incidence of dexmedetomidine-induced adverse effects compared to previous studies. A lower baseline MAP <70 mmHg was an independent predictor of dexmedetomidine-induced hypotension and bradycardia. Recommendations: Patients on dexmedetomidine with a lower baseline MAP <70 mmHg should be monitored more frequently within the first hour.
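The first-hour hypotension and bradycardia cutoffs defined in the methods can be expressed as a small helper function. This is an illustrative sketch (the function name and return format are assumptions, not from the study):

```python
def hemodynamic_adverse_effects(baseline_map, map_1h, baseline_hr, hr_1h):
    """Apply the study's first-hour cutoffs:
    hypotension  = MAP < 60 mmHg, or a drop of >=30% from baseline MAP;
    bradycardia  = HR  < 60 bpm,  or a drop of >=30% from baseline HR."""
    hypotension = map_1h < 60 or (baseline_map - map_1h) / baseline_map >= 0.30
    bradycardia = hr_1h < 60 or (baseline_hr - hr_1h) / baseline_hr >= 0.30
    return hypotension, bradycardia

# Using the reported cohort means (baseline MAP 90.49 mmHg falling by 5.16,
# baseline HR 98.75 bpm falling by 0.91), neither cutoff is met on average.
hemodynamic_adverse_effects(90.49, 85.33, 98.75, 97.84)  # (False, False)
```

Individual patients crossing either threshold, rather than the cohort average, are the ones counted in the reported 8.2% and 1.6% incidences.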
Background: Dyslipidemia is the presence of abnormal blood lipid parameters, characterized by increased LDL-C, triglycerides, and total cholesterol but reduced HDL-C. It is a common finding in patients with T2DM, occurring at a prevalence of between 70% and 85%, and promotes the development of long-term cardiovascular complications, which are the leading cause of mortality in this population. Statins are the first-line drugs, but lipid control varies from patient to patient despite their widespread use. Objective: To assess lipid control and the factors associated with LDL-C control in patients with type 2 DM on statins at a national referral hospital in Western Kenya. Methods: A retrospective study of 211 patients with type 2 DM who had been on a statin for at least three months. Data were obtained from patient records, and lipid measures were categorized as controlled or uncontrolled based on the Kenya National Guidelines for the Management of Diabetes Mellitus, 2018. Chi-square and Fisher's exact tests determined the associations between variables. A multivariate logistic regression model was fitted for variables significant at the bivariate level, and a P value of <0.05 was considered significant. Results: Most patients (99%) were on a single lipid-lowering drug, mainly atorvastatin, and 92% were on moderate-intensity dosing. Regarding lipid control, 50.3% had uncontrolled LDL-C, 30% had uncontrolled HDL-C, and 47% had uncontrolled triglyceride levels. Being on a high-intensity statin increased the likelihood of LDL-C control compared to moderate-intensity dosing (OR 8.57 [95% CI 4.3-16.9, P<0.001]). Conclusion: LDL-C was the most poorly controlled parameter. Patients on high-intensity statins had better LDL-C control; therefore, high-intensity statin therapy should be initiated in diabetic patients who do not achieve their LDL-C targets.
Background: Pregnancy poses specific challenges in the diagnosis of Plasmodium falciparum infection due to parasite sequestration in the placenta. Diagnosing Plasmodium falciparum infection in pregnant mothers therefore requires highly sensitive methods to detect the presence of parasites. These include methods that detect parasite antigens and methods that detect and quantify the malaria parasites themselves. Objective: The study assessed the performance of a malaria rapid diagnostic test (PfHRP2-RDT) in the detection of malaria infection in blood samples from nulliparous pregnant women within the first trimester of pregnancy in Western Kenya. Methods: This was a prospective study on blood specimens collected from pregnant women in a malaria-endemic region in Kenya. m-polymerase chain reaction (mPCR) and mRDT tests were performed. The diagnostic accuracy of mRDT was compared against mPCR as the gold standard for the purposes of this study. Setting: Twelve primary health facilities in Busia, Bungoma, and Kakamega Counties in Kenya. Results: Of 264 mPCR-positive samples, 130 were mRDT positive (true positives) while 134 were mRDT negative (false negatives). Of 441 mPCR-negative samples, 41 were positive on mRDT (false positives). Thus, in comparison with mPCR, the sensitivity and specificity of mRDT for detecting malaria infection in nulliparous pregnant mothers in the first trimester were 49.2% and 88.9%, respectively. Conclusions: The sensitivity of mRDT in detecting Plasmodium falciparum infections in nulliparous pregnant mothers in the first trimester was not satisfactory compared to mPCR.
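The reported accuracy figures follow directly from the 2×2 counts given in the results. A minimal sketch (note that from these counts alone the specificity comes out slightly above the reported 88.9%, which suggests some negative samples may have been excluded or indeterminate; the code below only illustrates the standard formulas):

```python
def diagnostic_accuracy(tp, fn, fp, tn):
    """Sensitivity = TP/(TP+FN); specificity = TN/(TN+FP)."""
    return tp / (tp + fn), tn / (tn + fp)

# Counts from the abstract: 264 mPCR-positive samples (130 TP, 134 FN)
# and 441 mPCR-negative samples (41 FP, leaving up to 400 TN).
sens, spec = diagnostic_accuracy(tp=130, fn=134, fp=41, tn=400)
# sens ≈ 0.492, matching the reported 49.2%;
# spec ≈ 0.907 from these counts, versus the reported 88.9%.
```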
There is limited data on the bleeding safety profile of direct oral anticoagulants, such as rivaroxaban, in low- and middle-income country settings like Kenya. In this prospective observational study, patients newly started on rivaroxaban, or switching to rivaroxaban from warfarin, for the management of venous thromboembolism (VTE) at the national referral hospital in western Kenya were assessed to determine the frequency of bleeding during treatment. Bleeding events were assessed at the 1- and 3-month visits, as well as at the end of follow-up. The International Society on Thrombosis and Haemostasis (ISTH) and the Bleeding Academic Research Consortium (BARC) criteria were used to categorize the bleeding events, and descriptive statistics were used to summarize categorical variables. Univariate and multivariate logistic regression models were used to calculate unadjusted and adjusted associations between patient characteristics and bleeding. The frequency of any type of bleeding was 14.4% (95% CI: 9.3%-20.8%), for an incidence rate of 30.9 bleeding events (95% CI: 20.1-45.6) per 100 patient-years of follow-up. The frequency of major bleeding was 1.9%, while that of clinically relevant non-major bleeding was 13.8%. In the multivariate logistic regression model, being a beneficiary of the national insurance plan was associated with a lower risk of bleeding, while being unemployed was associated with a higher bleeding risk. The use of rivaroxaban in the management of VTE was associated with a high frequency of bleeding. These findings warrant confirmation in larger and more targeted investigations in a similar population.
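The incidence rate quoted above normalizes the event count by accumulated follow-up time rather than by the number of patients. A minimal sketch of the calculation, with hypothetical inputs (the study's exact event count and person-time are not given in the abstract):

```python
def rate_per_100_patient_years(events, total_follow_up_years):
    """Events divided by accumulated person-time, scaled to 100 patient-years."""
    return 100 * events / total_follow_up_years

# Hypothetical: 30 bleeding events over ~97 patient-years of follow-up
# gives a rate close to the study's reported 30.9 per 100 patient-years.
rate_per_100_patient_years(30, 97.0)  # ≈ 30.9
```

Because patients were followed for different lengths of time, this person-time denominator is more informative than the raw 14.4% frequency.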
BACKGROUND: Abacavir is a nucleoside reverse transcriptase inhibitor used as a component of antiretroviral treatment regimens in the management of human immunodeficiency virus infection in both adults and children. It is efficacious, but its use may be limited by a hypersensitivity reaction linked to the HLA-B*57:01 genotype. HLA-B*57:01 has been reported to be rare in African populations. Because of the nature of its presentation, abacavir hypersensitivity is prone to late diagnosis and treatment, especially in settings where HLA-B*57:01 genotyping is not routinely done. CASE REPORT: We report a case of a severe hypersensitivity reaction in a 44-year-old Kenyan female living with human immunodeficiency virus and on abacavir-containing antiretroviral therapy. The patient presented to the hospital after recurrent treatment for a throat infection with complaints of fever, headache, throat ache, vomiting, and a generalized rash. Laboratory results showed raised aminotransferases, and she was advised to stop the antiretrovirals that she had recently been started on. The regimen consisted of abacavir, lamivudine, and dolutegravir. She responded well to treatment but was readmitted a day after discharge with vomiting, severe abdominal pains, diarrhea, and hypotension. Her symptoms resolved during admission, but she was readmitted again a few hours after discharge in a hysterical state with burning chest pain and chills. Abacavir hypersensitivity was suspected; on questioning, she reported that she had taken the abacavir-containing antiretrovirals shortly before she was taken ill. A sample for HLA-B*57:01 genotyping was taken and tested positive. Her antiretroviral regimen was switched to tenofovir, lamivudine, and dolutegravir, and on subsequent follow-up she has been well. CONCLUSIONS: Clinicians should always be cognizant of this adverse reaction whenever they initiate abacavir-containing therapy.
We would recommend that studies be done in our setting to verify the prevalence of HLA-B*57:01.
Summary: Genetic studies in underrepresented populations identify disproportionate numbers of novel associations. However, most genetic studies use genotyping arrays and sequenced reference panels that best capture variation most common in European ancestry populations. To compare data generation strategies best suited for underrepresented populations, we sequenced the whole genomes of 91 individuals to high coverage as part of the Neuropsychiatric Genetics of African Population-Psychosis (NeuroGAP-Psychosis) study with participants from Ethiopia, Kenya, South Africa, and Uganda. We used a downsampling approach to evaluate the quality of two cost-effective data generation strategies, GWAS arrays versus low-coverage sequencing, by calculating the concordance of imputed variants from these technologies with those from deep whole-genome sequencing data. We show that low-coverage sequencing at a depth of ≥4× captures variants of all frequencies more accurately than all commonly used GWAS arrays investigated and at a comparable cost. Lower depths of sequencing (0.5–1×) performed comparably to commonly used low-density GWAS arrays. Low-coverage sequencing is also sensitive to novel variation; 4× sequencing detects 45% of singletons and 95% of common variants identified in high-coverage African whole genomes. Low-coverage sequencing approaches surmount the problems induced by the ascertainment of common genotyping arrays, effectively identify novel variation particularly in underrepresented populations, and present opportunities to enhance variant discovery at a cost similar to traditional approaches.
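The concordance metric at the heart of the array-versus-sequencing comparison can be sketched as a per-site genotype match rate. This is a simplification (the study stratifies concordance by allele frequency and works with imputed dosages), and the data below are toy values:

```python
def genotype_concordance(truth, imputed):
    """Fraction of sites where the imputed genotype equals the deep-WGS truth.
    Genotypes are coded as alt-allele counts (0, 1, or 2) per site."""
    assert len(truth) == len(imputed)
    return sum(t == i for t, i in zip(truth, imputed)) / len(truth)

# Toy data: 8 sites, with one imputation error at the fourth site.
genotype_concordance([0, 1, 2, 1, 0, 2, 1, 0],
                     [0, 1, 2, 0, 0, 2, 1, 0])  # 7/8 = 0.875
```

In the study, the "truth" genotypes come from the high-coverage whole genomes, and the "imputed" genotypes from downsampled low-coverage sequencing or simulated array data.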
Introduction: Tungiasis (sand flea disease or jigger infestation) is a neglected tropical disease caused by penetration of female sand fleas, Tunga penetrans, into the skin. The disease inflicts immense pain and suffering on millions of people, particularly children, in Latin America, the Caribbean, and sub-Saharan Africa. Currently, there is no standard treatment for tungiasis, and a simple, safe, and effective treatment option is required. Tea tree oil (TTO) has long been used as a parasiticidal agent against ectoparasites such as head lice, mites, and fleas, with proven safety and efficacy data. However, current data are insufficient to warrant a recommendation for its use in tungiasis. This trial aims to generate these data by comparing the safety and efficacy of a 5% (v/w) TTO proprietary gel formulation with a 0.05% (w/v) potassium permanganate (KMnO4) solution for tungiasis treatment. Methods and analysis: This trial is a randomised controlled trial (RCT) in primary schools (n=8) in South-Western Kenya. The study will include school children (n=88) aged 6–15 years with a confirmed diagnosis of tungiasis. The participants will be randomised in a 1:1 ratio to receive a twice-daily treatment for 3 days of either 5% TTO gel or 0.05% KMnO4 solution. Two viable embedded sand flea lesions per participant will be targeted, and the viability of these lesions will be followed throughout the study using a digital handheld microscope. The primary outcome is the proportion of observed viable embedded sand fleas that have lost viability (non-viable lesions) by day 10 (9 days after the first treatment).
Secondary outcomes include improvement in acute tungiasis morbidities assessed using a validated severity score for tungiasis, safety assessed through adverse events, and product acceptability assessed by interviewing the participants to rate the treatment in terms of effectiveness, side effects, convenience, suitability, and overall satisfaction. Ethics and dissemination: The trial protocol has been reviewed and approved by the University of Canberra Human Research Ethics Committee (HREC-2019-2114). The findings of the study will be presented at scientific conferences and published in a peer-reviewed journal. Trial registration numbers: Australian New Zealand Clinical Trials Registry (ACTRN12619001610123); PACTR202003651095100 and U1111-1243-2294.
Most of the plants used by herbalists amongst the various Kenyan communities have not been documented despite their widespread use. The purpose of this research was to document the medicinal plants used by herbalists from the Maasai, a community that still relies to a large extent on herbal medicine for the provision of medical services. Semi-structured interviews, direct observations, group discussions, and in-depth interviews were used to collect information from the traditional healers. A total of 47 plant species belonging to 31 families were identified. They were used in the treatment of 33 medical and 4 veterinary conditions.
Background: Warfarin is a drug with a narrow therapeutic index used in the management of thromboembolic disorders. Several factors affect its plasma concentrations, with a resultant risk of toxicity. We examined a database of patients on warfarin therapy in order to establish the factors that affect the stability of the INR and correlated them with clinical outcomes in a resource-limited setting. Methods: We analysed retrospective data of patients admitted to the adult medical wards at Moi Teaching and Referral Hospital (MTRH) in 2015. The inclusion criteria were patients with thromboembolic and related disorders on warfarin treatment. Derived data included demographics, indications for warfarin use, co-prescribed drugs, comorbidities, INR measurements, duration of hospital stay, and clinical outcomes. Descriptive statistics were used to summarize the data. Pearson's correlation coefficient was used to assess the relationship between the duration of hospitalization and the number of INR tests. Regression splines were used to capture INR trends during the follow-up period. Data were analysed using R v. 3.3.1. Results: A total of 310 patients had thromboembolic disorders, of whom 63 met the study criteria. The median age was 48 years, and INR was measured, on average, once every four days. The majority of patients did not achieve stable INR values, with only two having consecutive INR values within the therapeutic goal. Patients who died had high INR levels. The median duration of hospital stay was 9 days (IQR: 7.0, 16.5). There was a significant correlation between length of hospital stay and the number of INR measurements (Corr = 0.667, p < 0.001). The two most common indications for warfarin were DVT (64.4%) and atrial fibrillation (24.7%).
All the patients had one or more comorbid conditions, except for 11 with DVT alone; cardiovascular diseases and infections were the most frequent. Most were on concomitant medications, the majority of which are known to interact with warfarin. Conclusions: It was difficult to achieve a stable INR under the prevailing conditions despite the frequent tests. The potential factors that may have contributed to the fluctuations include drug-drug interactions, the frequency of INR tests, comorbidities, and the short duration of hospital stay.
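The reported association between hospital stay and INR testing frequency is a Pearson correlation coefficient. A self-contained sketch of the calculation on hypothetical data (the values below are illustrative, not the study's):

```python
from math import sqrt

def pearson_r(x, y):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sqrt(sum((a - mx) ** 2 for a in x))
    sy = sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Hypothetical: longer stays tend to accumulate more INR tests.
length_of_stay = [7, 9, 16, 10, 21, 8]   # days
inr_tests      = [2, 3, 4, 3, 6, 2]
pearson_r(length_of_stay, inr_tests)     # strongly positive
```

A coefficient of 0.667, as in the study, indicates a moderately strong positive relationship: the longer the admission, the more INR tests were done.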
BACKGROUND: Nanotechnology is now considered a promising method for delivering orally administered hydrophobic drugs to their sites of action. The effect of nanodispersion on cellular transport and accumulation of saquinavir (SQV) was investigated. METHODS: The transport of five solid drug nanoparticle (SDN) SQV formulations across Caco-2 cell monolayers (CCM) was compared to that of standard SQV. The SDNs were prepared using SQV mesylate (20%) and Pluronic F127 (10%) plus five other excipients (HPMC, PVP, PVA, Lecithin S75, and Span 80) in different proportions. Cellular accumulation in CEM parental and CEMVBL (P-gp overexpressing) cells was measured to ascertain the effect of nanodispersion on P-gp-mediated efflux of SQV. All SDN formulations were dissolved in water, whereas standard SQV was dissolved in DMSO to improve solubility. Quantification was by HPLC. RESULTS: In the transport experiments, an SDN formulation composed of SQV mesylate/Pluronic F127 plus HPMC (70%) had a 24% increase in apparent absorption compared to standard SQV, largely driven by a 38% reduction in basolateral-to-apical permeation. Additionally, this formulation and two others (SQV mesylate/Pluronic F127 alone; and plus HPMC [65%]/Lecithin [5%]) accumulated to a significantly greater extent in CEM cells, suggesting enhanced delivery to these cells. Moreover, the accumulation and transport of the three SDNs compared well to that of standard SQV despite being dissolved in water, suggestive of improved dissolution. The inclusion of PVA resulted in increased efflux. CONCLUSION: The use of HPMC and Pluronic F127 produced SQV SDNs with improved permeation in Caco-2 cells and improved accumulation in CEM cells, whereas PVA had negative effects.
PURPOSE: Kaposi's sarcoma (KS) is a spindle cell tumor resulting from growth dysregulation in the setting of infection with human herpesvirus-8 (also called KS herpesvirus). Advanced KS is characterized by poor responses to antiretroviral therapy and to some of the chemotherapy readily accessible to patients in low-resource areas. Gemcitabine induced partial and complete regression of AIDS-associated KS (AIDS-KS) in 11 of 24 patients in a pilot study. The current study compares the antimetabolite gemcitabine with standard-of-care bleomycin and vincristine (BV) in the treatment of chemotherapy-naive patients with AIDS-KS in a resource-limited setting. PATIENTS AND METHODS: Patients with persistent or progressive KS despite treatment with combined antiretroviral therapy were randomly assigned to receive gemcitabine 1,000 mg/m(2) or bleomycin 15 IU/m(2) and vincristine 1.4 mg/m(2) given twice weekly. The main end points were objective response by bidirectional measurement, adverse events, and quality of life after three cycles of chemotherapy. RESULTS: Of 70 participants enrolled, 36 received gemcitabine and 34 received BV. Complete response was achieved in 12 patients (33.3%) in the gemcitabine arm and six (17.6%) in the BV arm (P = .175). The partial response rate was 52.8% (n = 19) in the gemcitabine arm and 58.8% (n = 20) in the BV arm. Both study arms reported similar neurologic and hematologic adverse events, and there was a statistically significant baseline-to-post-treatment improvement in health-related quality-of-life scores. CONCLUSION: The results of this randomized, phase IIA trial demonstrate gemcitabine activity in chemotherapy-naive patients with AIDS-KS, on the basis of response rates, adverse events, and health-related quality-of-life scores.
The right to access essential medicines and medical technologies is crucial to attaining the highest-quality health care for all citizens of the world. Unfortunately, in many low- and middle-income countries (LMICs), patients' ability to access quality essential medicines remains a critical challenge. Barriers that impact the quality of essential medicines for chronic communicable and non-communicable diseases lie within three specific areas (the 3 A's): availability, accountability, and adherence. First, unnecessarily complex supply chain management, poor operational procedures, and inadequate financing for health lead to low availability of medicines. Second, corruption contributes to falsified and substandard medicines and low accountability of the supply chain to the patients who rely on it. Lastly, poor patient adherence to medicines is affected by low health literacy, lack of communication between providers and patients, and the social stigma of diseases. Based on our on-the-ground experiences working in western Kenya, we propose solutions that target each of these challenges to improve the access to and quality of medicines. Through this chapter, we hope to compel chemists to apply and focus their efforts on creating transformative chemical techniques with the potential to significantly improve the quality of medicines, improve patient outcomes, and alter the delivery of care to patients all over the world.