Protecting maternal health in Rwanda


The world is facing a maternal health crisis. According to the World Health Organization, about 810 women die every day from preventable causes related to pregnancy and childbirth. Two-thirds of these deaths occur in sub-Saharan Africa. In Rwanda, infected cesarean wounds are a leading cause of maternal mortality.

A multidisciplinary team of physicians and researchers from MIT, Harvard University and Partners in Health (PIH) in Rwanda has proposed a solution to this problem. They have developed a mobile health platform (mHealth) that uses artificial intelligence and real-time computer vision to predict infections in cesarean wounds with about 90 percent accuracy.

“Early detection of infections is an important issue worldwide, but in resource-poor areas like rural Rwanda the problem is even worse due to the lack of trained doctors and the high prevalence of bacterial infections that are resistant to antibiotics,” says Richard Ribon Fletcher ’89, SM ’97, PhD ’02, research scientist in mechanical engineering at MIT and technology lead for the team. “Our idea was to use mobile phones that could be used by community health workers to visit young mothers in their homes and inspect their wounds to check for infection.”

This summer, the team, led by Harvard Medical School professor Bethany Hedt-Gauthier, was recognized with the $500,000 first place award in the NIH Technology Accelerator Challenge for Maternal Health.

“The lives of women who give birth by cesarean section in developing countries are impacted both by limited access to quality surgery and postpartum care,” adds Fredrick Kateera, a team member at PIH. “The use of mobile health technologies to early detect and plausibly accurately diagnose patients with postoperative wound infections in these communities would be a scalable game changer in optimizing women’s health.”

Training algorithms to detect infections

The genesis of the project was the result of several chance encounters. In 2017, Fletcher and Hedt-Gauthier met on the Washington Metro during an NIH investigators meeting. Hedt-Gauthier, who by then had been working on research projects in Rwanda for five years, was looking for a way to close the gap in follow-up care after cesarean delivery that she and her collaborators had encountered in their research. In particular, she was interested in exploring the use of cell phone cameras as a diagnostic tool.

Fletcher, who leads a group of students in Professor Sanjay Sarma’s AutoID lab and has spent decades applying phones, machine learning algorithms and other mobile technologies to global health, was ideally suited to the project.

“When we realized that these types of image-based algorithms could aid in the home care of women after cesareans, we reached out to Dr. Fletcher as a partner, as he has extensive experience developing mHealth technologies in low- and middle-income environments,” says Hedt-Gauthier.

Fortunately, on the same trip, Hedt-Gauthier sat next to Audace Nakeshimana ’20, then a new MIT student from Rwanda who would later join Fletcher’s team. With Fletcher’s mentorship, Nakeshimana founded Insightiv, a Rwandan startup applying AI algorithms to the analysis of clinical images, during his senior year, and the startup was a top grantee of the annual MIT IDEAS competition in 2020.

The first step of the project was to compile a database of wound images taken by health workers in rural Rwanda. They collected over 1,000 images of infected and uninfected wounds and then trained an algorithm with that data.

With this first data set, collected between 2018 and 2019, a central problem emerged: many of the photographs were of poor quality.

“The quality of the wound images collected by healthcare workers varied widely and required a lot of manual work to crop and reprocess the images. Because these images are used to train the machine learning model, the image quality and variability fundamentally limit the algorithm’s performance,” says Fletcher.

To solve this problem, Fletcher turned to tools he had used in previous projects: real-time computer vision and augmented reality.

Improving image quality through real-time image processing

To encourage community health workers to take higher-quality images, Fletcher and the team redesigned the wound screening mobile app and paired it with a simple paper frame. The frame included a printed calibration color pattern and another optical pattern that guides the app’s computer vision software.

Healthcare workers are instructed to place the frame over the wound and open the app, which provides real-time feedback on the camera’s placement. The app uses augmented reality to display a green check mark when the phone is in the proper range. Once in range, other parts of the computer vision software automatically color-balance and crop the image and apply transformations to correct for parallax.

“By using real-time computer vision at the time of data collection, we are able to produce beautiful, clean, uniformly color-balanced images that can then be used to train our machine learning models, without the need for any manual data cleaning or post-processing,” says Fletcher.
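
Although the team’s preprocessing code is not published, the steps described above correspond to standard computer vision operations. The sketch below, written with OpenCV, is a minimal illustration only: the function name, the frame-corner and white-patch inputs, and the output size are hypothetical, and the real app detects the paper frame and runs these steps automatically on the phone.

```python
# Minimal preprocessing sketch (not the team's code): correct parallax with a
# perspective warp defined by the paper frame's corners, then color-balance
# against the printed white calibration patch. In the real app the frame is
# detected automatically; here the corner and patch locations are passed in.
import cv2
import numpy as np

def preprocess_wound_photo(image, frame_corners, white_patch_box, out_size=512):
    """image: HxWx3 BGR photo from the phone camera.
    frame_corners: 4x2 array of the frame's corner pixels (TL, TR, BR, BL).
    white_patch_box: (x, y, w, h) of the white calibration patch, in warped coords.
    """
    # Parallax correction: map the frame corners onto an axis-aligned square.
    dst = np.float32([[0, 0], [out_size, 0], [out_size, out_size], [0, out_size]])
    H = cv2.getPerspectiveTransform(np.float32(frame_corners), dst)
    warped = cv2.warpPerspective(image, H, (out_size, out_size))

    # Color balance: scale each channel so the known white patch becomes neutral.
    x, y, w, h = white_patch_box
    patch_mean = warped[y:y + h, x:x + w].reshape(-1, 3).mean(axis=0)
    gains = patch_mean.mean() / np.maximum(patch_mean, 1e-6)
    balanced = np.clip(warped.astype(np.float32) * gains, 0, 255).astype(np.uint8)
    return balanced
```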

Using convolutional neural network (CNN) machine learning models and a method called transfer learning, the software was able to successfully predict infection in cesarean wounds with about 90 percent accuracy within 10 days of delivery. Women who are predicted to be infected via the app are then referred to a clinic, where they can receive a diagnostic bacterial test and be prescribed life-saving antibiotics if needed.
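
Transfer learning here means starting from a CNN pretrained on a large, generic image dataset and retraining only a small classification head on the wound photos. The sketch below, using TensorFlow/Keras with a MobileNetV2 backbone, is a minimal illustration of that idea; the backbone choice, directory names, image size, and training schedule are assumptions, not the team’s published configuration.

```python
# Minimal transfer-learning sketch, assuming a folder of cropped wound images
# sorted into "infected" / "not_infected" subdirectories. The MobileNetV2
# backbone and training schedule are illustrative choices only.
import tensorflow as tf

IMG_SIZE = (224, 224)
train_ds = tf.keras.utils.image_dataset_from_directory(
    "wound_images/train", image_size=IMG_SIZE, batch_size=32)
val_ds = tf.keras.utils.image_dataset_from_directory(
    "wound_images/val", image_size=IMG_SIZE, batch_size=32)

# Start from an ImageNet-pretrained CNN and train only a small head on top.
base = tf.keras.applications.MobileNetV2(
    input_shape=IMG_SIZE + (3,), include_top=False, weights="imagenet")
base.trainable = False

model = tf.keras.Sequential([
    tf.keras.layers.Rescaling(1.0 / 127.5, offset=-1),  # MobileNetV2 expects [-1, 1]
    base,
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dropout(0.2),
    tf.keras.layers.Dense(1, activation="sigmoid"),  # outputs P(infected)
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.fit(train_ds, validation_data=val_ds, epochs=10)
```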

The app has been well received by women and health workers in Rwanda.

“The trust women place in community health workers, who have been a big promoter of the app, led to the acceptance of the mHealth tool by women in rural areas,” adds Anne Niyigena of PIH.

Using thermal imaging to eliminate algorithmic bias

One of the biggest hurdles in scaling this AI-based technology to a more global audience is algorithmic bias. When trained on a relatively homogeneous population like rural Rwanda, the algorithm works as expected and can successfully predict infection. But when images of patients with different skin colors are introduced, the algorithm is less effective.

To address this issue, Fletcher used thermal imaging. Simple thermal imaging camera modules designed to attach to a cell phone cost around $200 and can be used to capture infrared images of wounds. Algorithms can then be trained using the thermal patterns from infrared wound images to predict infection. A study published last year showed over 90 percent prediction accuracy when these thermal images were paired with the app’s CNN algorithm.
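
The article does not detail how the thermal frames are fed to the CNN. One plausible approach, sketched below, is to normalize the per-pixel temperature map from the clip-on camera and replicate it across three channels so the same kind of pretrained backbone can consume it; the temperature range, function name, and reuse of the `model` from the earlier sketch are all assumptions for illustration.

```python
# Illustrative sketch (not the team's method): turn a radiometric temperature
# map into an input a standard pretrained CNN can accept.
import numpy as np
import tensorflow as tf

def thermal_to_model_input(temps_c, img_size=(224, 224), t_min=25.0, t_max=40.0):
    """temps_c: 2D array of per-pixel skin temperatures in degrees Celsius."""
    norm = np.clip((temps_c - t_min) / (t_max - t_min), 0.0, 1.0)
    resized = tf.image.resize(norm[..., None].astype("float32"), img_size)  # HxWx1
    rgb_like = tf.repeat(resized, 3, axis=-1) * 2.0 - 1.0                   # scale to [-1, 1]
    return tf.expand_dims(rgb_like, 0)                                      # add batch dim

# prob_infected = float(model.predict(thermal_to_model_input(thermal_frame))[0, 0])
```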

While more expensive than simply using the phone’s camera, the thermal imaging approach could be used to scale the team’s mHealth technology to a more diverse, global population.

“We give healthcare workers two options: In a homogeneous population like rural Rwanda, they can use their standard phone camera using the model trained with local population data. Otherwise, they can use the more generic model that requires a thermal camera mount,” says Fletcher.

While the current generation of the mobile app uses a cloud-based algorithm to run the infection prediction model, the team is now working on a standalone mobile app that does not require internet access and also addresses all aspects of maternal health, from pregnancy to the postpartum period.

In addition to developing the library of wound images used in the algorithms, Fletcher works closely with former student Nakeshimana and his team at Insightiv to develop the app, using the Android phones that are made locally in Rwanda. PIH will then conduct user testing and field-based validations in Rwanda.

In the development of the comprehensive maternal health app, privacy and data protection are given top priority.

“As these tools are developed and refined, greater attention must be paid to patient privacy. More data security details should be incorporated so that the tool fills the gaps it aims to bridge and maximizes user trust, which will eventually favor its adoption at a larger scale,” says Niyigena.

The award-winning team includes: Bethany Hedt-Gauthier of Harvard Medical School; Richard Fletcher of MIT; Robert Riviello of Brigham and Women’s Hospital; Adeline Boatin of Massachusetts General Hospital; Anne Niyigena, Fredrick Kateera, Laban Bikorimana, and Vincent Cubaka of PIH in Rwanda; and Audace Nakeshimana ’20, founder of Insightiv.ai.


