I’m a Scientist, Get Me Out of Here!

Dan Fovargue is a researcher on the FORCE imaging project in the Department of Biomedical Engineering. Here he describes his experience taking part in “I’m a Scientist, Get Me Out of Here!”

This post originally appeared on the King’s Engaged Research Network blog. 

Why did you want to engage the public? 

I recently participated in an I’m a Scientist, Get Me Out of Here! event. Past events had piqued my interest, as I was drawn to the opportunity to explain and think about a range of topics in science. The competition aspect of the event also seemed really fun. So this time around, when I saw that they were running a medical physics zone, I was especially keen to join, knowing that the focus would be on topics related to my research.

I work on MR elastography, an imaging and engineering method for non-invasively measuring tissue stiffness. Although this is a method with much potential, it is still somewhat unknown, especially among the public. It seems, then, that elastography could benefit from some outreach and exposure, so I was additionally motivated to participate in this event.


Who did you engage with and what did you do?

The events are competitions between five scientists working in a certain area. The scientists answer questions from students (years 7-12) both by posting responses on a website and by discussing during live chat sessions. The students vote for a scientist based on how the scientists answered their questions. The most common questions covered the whys and hows of becoming a scientist. These were followed by an assortment of science questions across many disciplines, including the cool (black holes, artificial intelligence) and the controversial (big bang, evolution, climate change).

Of course, there were plenty of questions related to medical physics. One helpful feature of the competition was that the scientists had profiles with information on their research (as well as hobbies). This way, the students could tailor questions to each scientist, including asking about their specific work and research. However, the philosophy of the event is to allow any question to be asked, so the students don’t have to stick to science. This results in another category of random and silly, but usually interesting, questions. 

What was the impact?

Fortunately, I got to answer lots of questions about how elastography works, what we can apply it to, and how it improves diagnoses. I even had a few rather insightful questions regarding the specifics of the physical processes in elastography, mostly questioning what types of mechanical waves can be seen with MRI and how this relates to measuring tissue stiffness. I also had the opportunity to highlight the work of other members of the elastography research group at KCL and other collaborators. I think the students were impressed that people I work with are currently applying elastography to a broad range of diseases like heart failure, breast cancer, and liver fibrosis.

Hopefully, even the students who aren’t currently too interested in science were able to get something out of this event. Part of the point is to simply show that scientists are normal enough people and break down the walls of the ivory tower. I tried to take this to heart, so I made sure to be myself and discuss a variety of topics with the students. 

How did it influence your research/you as a researcher?

Explaining my research and other scientific topics in this format was challenging but very rewarding. It was interesting to go from a meeting with my PI to answering a year 8 student’s question on elastography. The students also asked a lot of big picture questions, like where my work fits in with cancer treatment in general. I had to do some reading on cancer during the event to answer these questions appropriately which, in turn, gave me a better appreciation and understanding of the clinical side of medical physics.

During the live chat sessions there was very little time to plan out answers. I would read a question, try to think of a clever response, spend a few seconds writing, sometimes read back over it, and then move on to the next question. I usually like to think very carefully when explaining concepts, but the pace here did not allow for this. So, although these 30-minute sessions could be exhausting, they really helped me improve my ability to explain concepts quickly and increased my confidence in doing so. There was just no time to second guess myself.

Overall this was a really fun experience. Oh! And I won!

If you’re interested in taking part in I’m A Scientist yourself, take a look at their website: https://imascientist.org.uk/  

Student’s Mission to Improve Prenatal Imaging in Rural Bangladesh

Faisel Alam, a Master’s student from the Division of Imaging Sciences, has been working with a new charity launched by KCL students to improve access to medical imaging for pregnant mothers in rural Bangladesh. He’s just returned from a trip to the country, where the team worked alongside local medical professionals, leading seminars on best practice and offering healthcare to patients who might otherwise not have been able to receive it.

Maternal Aid Association (MAA) is a grassroots, student-led charity striving to bring safe, effective, high-quality maternal healthcare to resource-poor settings such as Bangladesh. This is the first overseas trip MAA has taken with the aim of providing high-quality, long-term, sustainable maternal care in rural areas of Bangladesh. MAA has established strong links with British healthcare professionals, Bangladeshi universities, and medical professionals and healthcare students in Bangladesh.


The trip was led by Faisel Alam, an MRes Medical Imaging Sciences student at King’s College London. Faisel received a scholarship from the Medical Research Council (MRC) and completed his MRes research project on access to medical imaging across the developing world, using maternal care in Bangladesh as a case study. The project was supervised by Dr Gregory Mullen, Senior Lecturer in Imaging Biology, and Professor Philip Blower, both of the Division of Imaging Sciences and Biomedical Engineering.

The team of student volunteers spent two weeks working alongside Bangladeshi healthcare professionals from Sylhet Women’s Medical College and MAG Osmani Medical College, in a variety of hospital settings. While there, the team also prepared and led a seminar on maternal care at Sylhet Women’s Medical College, drawing on evidence-based best practice and reflections from their international volunteering experience. The team also ran three days of health camps at Balaganj, Sylhet, offering free basic health checks in rural parts of Bangladesh and reaching hundreds of local residents who would not otherwise have access to basic healthcare.

The team also received basic sonography training from Mrs Susan Halson-Brown, the MSc Ultrasound Lead, and had planned to deliver basic sonography to pregnant mothers in a rural village in Bangladesh using a portable ultrasound device. However, the team was unable to run the ultrasound station because of unforeseen difficulties securing trained, qualified sonographers from the local hospitals for the rural village health complex. Nonetheless, this is an area the team will develop and expand in future years.

MAA was founded by Aqil Jaigirdar, a 3rd year King’s College London medical student, and the team comprises current King’s College London healthcare students supported by Professor Janice Rymer, Vice President of the Royal College of Obstetricians and Gynaecologists, and Dr Daghni Rajasingham, Consultant Obstetrician at Guy’s and St Thomas’ NHS Foundation Trust.

Follow their progress on Snapchat @MaaCharityUK, on Facebook facebook.com/maacharityuk and on Twitter @maacharityuk, or view footage from their health complex in Balaganj, a rural Bangladeshi village.

Making reproducible research as natural as breathing

Peter Charlton is a PhD student at King’s College London working as part of the Hospital of the Future (HotF) project. The overall aim of the HotF project is to provide early identification of hospital patients who are deteriorating. Peter’s work focuses on using wearable sensors to continuously assess patients’ health.

One of the key aims of the HotF project is to develop a technique to continuously monitor a patient’s “respiratory rate”: how often they breathe. Respiratory rate often changes early in the progression of a deterioration, giving advanced warning of a severe event such as a heart attack. However, it is currently measured by hand: a clinician counts the number of times a patient breathes in a set period of time. This approach is time-consuming, inaccurate, and only provides intermittent measurements. The alternative approach, which I’m working on, is to estimate respiratory rate from a small, unobtrusive, wearable sensor.

Wearable sensors are currently routinely used to monitor heart rate and blood oxygenation levels. It turns out that the signals which provide these measurements are subtly influenced by respiration, as demonstrated below. If these subtle changes can be extracted reliably, then we could monitor respiratory rate “for free”, without the need for any additional sensors. This may provide all-important information on changes in a patient’s health, allowing clinicians to identify deteriorating patients earlier.


The heart rate is clearly visible in this signal since each spike corresponds to a heart beat. The spikes also vary in height with each of the four breaths. These subtle changes can be used to estimate respiratory rate.
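The idea described above (heartbeat spikes whose heights rise and fall with each breath) can be sketched in code. The following is a minimal illustration, not the method used in the HotF project: it builds a synthetic pulse-like signal whose spike heights are modulated at the breathing frequency, then recovers the respiratory rate from the dominant frequency of the spike-height series.

```python
import numpy as np

def estimate_resp_rate(sig, fs):
    """Estimate respiratory rate (breaths/min) from the amplitude
    modulation of a pulsatile signal. A simplified sketch only."""
    # 1. Detect heartbeat spikes: local maxima above the signal mean.
    m = sig.mean()
    peaks = np.array([i for i in range(1, len(sig) - 1)
                      if sig[i] > sig[i - 1] and sig[i] >= sig[i + 1]
                      and sig[i] > m])
    heights = sig[peaks]
    # 2. Resample the irregular spike-height series onto a 4 Hz grid.
    t = peaks / fs
    grid = np.arange(t[0], t[-1], 0.25)
    resampled = np.interp(grid, t, heights)
    resampled -= resampled.mean()
    # 3. The dominant frequency of the height series approximates
    #    the breathing frequency.
    spectrum = np.abs(np.fft.rfft(resampled))
    freqs = np.fft.rfftfreq(len(resampled), d=0.25)
    f_resp = freqs[np.argmax(spectrum[1:]) + 1]  # skip the DC bin
    return 60.0 * f_resp

# Synthetic example: 1.2 Hz "heartbeat" spikes whose heights are
# modulated at 0.25 Hz, i.e. 15 breaths per minute.
fs = 100
t = np.arange(0, 60, 1 / fs)
carrier = np.clip(np.sin(2 * np.pi * 1.2 * t), 0, None) ** 8  # spiky pulses
sig = (1 + 0.3 * np.sin(2 * np.pi * 0.25 * t)) * carrier
print(estimate_resp_rate(sig, fs))  # roughly 15 breaths per minute
```

Real photoplethysmogram or ECG signals are far noisier than this synthetic one, which is exactly why so many competing estimation methods exist.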

So what’s all this got to do with reproducible research? Well, over the past few decades more than 100 papers have been written describing methods for estimating respiratory rate electronically from signals that are already monitored by wearable sensors. If you read them (it takes a long time) then you find that hundreds of methods have been described. The key questions are: which method is the best, and is it good enough to use in clinical practice? Answering these questions can be a daunting task given how many different methods there are. Very few of the methods are publicly available, so to answer these questions you’d have to implement each of the methods yourself. Even once you have done this, you’d need to try them out on some data. Collecting this data is no easy task. Altogether, reproducing scientists’ previous work on this problem is quite difficult.

I’m hoping that this won’t be such a problem in the future. We have recently implemented many of the methods, collected a benchmark dataset on which to test the methods, and reported the results. All of this is publicly available. What’s more, you can download it all for free, from the methods, to the data, to the article describing the results. So in a few clicks you can catch up, reproduce our research, and start making progress yourself, even producing methods like this:


Well, nearly … I’ve written a tutorial on the methods, which is due to be published in a textbook soon. This work can be reproduced exactly. Since then we have extended the range of publicly available resources by adding more methods and a new benchmark dataset. This most recent work can’t be reproduced exactly, since we had to make a few changes before making it publicly available. I intend to make future work on this topic fully reproducible so that researchers can build on our work. Who knows, perhaps this will contribute towards earlier identification of deteriorating patients in the future.

Many robotic arms make light work


Dr David Lloyd is a Clinical Research Fellow at King’s College London, working as part of the iFIND project. The overall aim of the intelligent Fetal Imaging and Diagnosis project is to combine innovative technologies into a clinical ultrasound system that will lead to a radical change in the way fetal screening is performed.

One of the goals of the iFIND project is to produce an antenatal screening system that uses multiple ultrasound probes at the same time. There are lots of potential advantages to this – for example, we could combine the images from two probes to see more of the baby at once, or provide a more detailed picture of one part of the baby. With iFIND, our hope is to have several separate 3D ultrasound probes working simultaneously, giving us the opportunity to see more of the baby, in more detail, than ever before.

The problem is, how do we control a number of ultrasound probes at the same time? I’ve yet to meet anyone who can scan with two probes at the same time, and several people trying to scan one patient sounds like a bit of a crowd! There is a solution though, and it’s something the team here at iFIND are working hard to develop: robotic arms.

Sounds pretty cool doesn’t it? Get robots to do the scans! But let’s stop and think about this for a minute. We need to make a robotic arm that can not just hold an ultrasound probe, but can twist, flex, rotate and extend, just like a human arm, to get all the views necessary to visualise the baby. Then we need to give it “eyes”: something to tell it not just what it is seeing now, but where and how to move to see other parts of the baby. It also needs to know exactly how hard to press, and we need to make sure it has thorough safety mechanisms built in. Perhaps it’s a tougher challenge than it sounds.

David with Jackie Matthew, sonographer & research training fellow, each holding a probe to simultaneously scan a phantom fetus

However, as I’ve learnt, no problem is insurmountable for the team at iFIND, and indeed our dedicated robotics group are designing an arm that can do just that. The first step is to record in detail what humans do when they perform a scan, and that’s exactly what we do with our participants. Each dedicated iFIND ultrasound scan we perform records not only the imaging data, but also the exact position, force and torque (twisting force) of the ultrasound probe throughout the scan.

The video below shows an example: on the left, we can see how the sonographer is moving the ultrasound probe across the abdomen of one of our volunteers; the colours under the probe show how much pressure they applied to the skin. The right panel shows the ultrasound images so we know exactly what they could see at the time.


We hope to collect information from all 500 of our participants, and will use it to instruct the robotic arms how to perform the ultrasound scans automatically, just like a person would.
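To give a concrete sense of the kind of data this involves, here is a hypothetical sketch of one record from such a scan log: a timestamped probe pose together with the contact force and torque. All field names and values are invented for illustration; this is not iFIND’s actual data format.

```python
from dataclasses import dataclass

@dataclass
class ProbeSample:
    """One timestamped reading from a tracked ultrasound probe.
    Field names are illustrative, not iFIND's actual schema."""
    t: float            # seconds since the start of the scan
    position: tuple     # probe tip (x, y, z) in mm
    orientation: tuple  # probe orientation as a quaternion (w, x, y, z)
    force: tuple        # contact force (Fx, Fy, Fz) in N
    torque: tuple       # twisting force (Tx, Ty, Tz) in N*mm

# A short, made-up excerpt from one scan:
samples = [
    ProbeSample(0.00, (10.0, 42.0, 5.0), (1, 0, 0, 0), (0.0, 0.0, 2.1), (0, 0, 3)),
    ProbeSample(0.02, (10.1, 42.0, 5.1), (1, 0, 0, 0), (0.0, 0.1, 2.3), (0, 0, 4)),
]

# One question such logs can answer directly: how hard did the
# sonographer press at most? That bound is exactly what a robotic
# arm's safety mechanism would need to respect.
peak_downward_force = max(s.force[2] for s in samples)
print(peak_downward_force)
```

Replaying thousands of these records from real scans is what would let the robotic arm learn both the motions and the safe force limits of a human sonographer.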

Another problem the team have to think about is far simpler, but perhaps just as important: aesthetics. The arms we design need to look and feel just as gentle and safe as we are designing them to be. So whilst we are collecting all the important data to help develop the technology, we are also asking participants how they would feel being scanned by a robotic arm rather than a person, and what we could do to make them more comfortable with the idea.

So: our goal is to produce a robotic arm (well, several of them, actually) that has the dexterity and sensitivity of a human being, knows how to perform a fetal ultrasound, and doesn’t look scary. And they also have to talk to each other.

Maybe we’ll leave that for another blog…

Read previous posts about the iFIND project written by David Lloyd.