iFind Blog No. 5: A Little Reflection


Last Friday, our 265th iFind volunteer arrived for her ultrasound with her partner and their chirpy two-year-old daughter in tow. Once the scan started, the little girl sat quietly with her father, slightly uninterested, even when he pointed out the face of her soon-to-be baby sister on the ultrasound screen. It wasn’t until we pulled open the blinds at the end of the scan, revealing the River Thames glistening between Westminster Palace and the London Eye, that her apathy finally subsided and she bounced up and down with glee.

Why am I telling you this? So two-year-olds don’t care much for ultrasound scans… got it. Well, there’s a little more to this story. You see, that little girl had been in this room before – and she definitely didn’t have the same response when the blinds went up the last time. Her mother (iFind participant number two hundred and sixty-five) was actually volunteering for iFind for the second time, having previously been iFind participant number… five. Yup, that excitable little two-year-old had once been the unborn face we glimpsed on the ultrasound machine, right back at the very beginning of our project.

Reflecting on the iFind project (pun by David, not me – Ed)

Cool, right? We thought so too. In fact, it was a pretty fascinating reminder of how much time has passed since we started over three years ago, and a good opportunity to reflect on how far we’ve come since then (which, it turns out, is a long way). But more on that later. First, a reminder of where we’re going: we are developing a fully integrated system of robotics, ultrasound, and computing that can automatically acquire and analyse the diagnostic imaging needed to detect fetal abnormalities. Yes, it’s ambitious; and no, we’re not (quite) there yet. But working with iFind is a bit like a mission to the moon: it will be exciting when we get there, but it’s just as exciting watching the incredible new technologies we are developing along the way.

OK then, what have we achieved? Well, for a start, we have built on the routine screening ultrasounds from thousands of our iFind 1 volunteers to help inform this incredible automatic image detection software, designed by my colleagues Christian Baumgartner and Bernhard Kainz using clever machine learning algorithms. My clinical colleagues Caroline Knight, Jackie Matthew and Tara Fletcher are already investigating the ways in which this might be useful to the teams performing routine antenatal screening scans. And by combining this software with the detailed tracking data from our iFind 2 volunteers – as demonstrated in this video from my colleagues Nicolas Toussaint and Alberto Gomez – we are starting to develop the backbone of the software that will guide the robotic elements of the final system (which, as my colleagues James Housden, Yohan Noh, Shuangyi Wang and Davi Singh will tell you, is amazing… but also top secret – for now. Sorry).

From the MRI side of things, iFind volunteer 265 means we’re now over halfway to our target of 500 complete fetal MRI scans, providing detailed imaging of the fetus to develop our final “atlas” of normal fetal development, being generated by Tong Zhang. We have seen some incredible yoga moves, some (technically flawless) moonwalking… and some babies just asking us to keep the noise down up there. In the last few months Jackie also completed her research looking at fetal weight, and we can now routinely report the fetal volume and weight in some of our MRI scans. In the area I am interested in – the fetal heart – Josh van Amerom has published these excitingly detailed video loops of fetal cardiac motion, with more developments in the pipeline. And finally, by using MRI with motion correction techniques, my own research is already providing useful diagnostic information for real patients at St Thomas’ Hospital, helping our clinical colleagues to provide the highest standards of care.

So there you have it. 265 patients in and we’ve come a long, long way. It’s thrilling to think where we might be in another three years. When I was told that I may have scanned that little girl before she was born, I joked, “Ah! I thought I recognised her”. I didn’t, of course. But the way things are going with iFind, one day… who knows?


Taking part in a research study: a different perspective

Dr Jenny Cook

Dr Jenny Cook is a Research Associate at King’s College London studying the impact of engaging publics with health research.

I am a researcher in Public Engagement for King’s College London and the National Institute for Health Research Biomedical Research Centre; part of my role is to promote taking part in clinical research to the general population.

So last month I decided to practise what I preach and took part in a research study at the Division of Imaging Sciences and Biomedical Imaging, based over at St Thomas’ Hospital. The study is called the iFind project, which stands for intelligent Fetal Imaging and Diagnosis.

The study, funded jointly by the Wellcome Trust and the EPSRC, aims to improve the accuracy of routine 18–20 week screening in pregnancy by bringing together advanced ultrasound and magnetic resonance imaging (MRI) techniques, robotics and computer-aided diagnostics.

So when I realised I was eligible to take part, I thought a couple of extra hospital visits would be for a good cause!

I had previously worked with the iFind team to engage different audiences with the project, running educational and interactive sessions using a pregnant tummy mannequin and the ultrasound machines for the King’s Health Partners Summer School and International Clinical Trials Day.

I emailed them for more details and was put in touch with a very friendly Research Midwife who sent me over a patient information pack and some available dates.

I arrived at the Clinical Research Facility on a grey drizzly morning and was met by Josie, a friendly researcher who ran through some final consent and information forms with me. She reassured me about the practical details of the study, like who would be present and how long each part would last.

I went into the ultrasound room and was met by three people: a research sonographer, a fetal cardiac clinician researcher, and another researcher working on the imaging robotics part of the project.

They talked me through the images they were collecting and explained what they meant and why they were important to the study. They also showed me how they can create 3D images using the new software, and at the end printed out five pictures for me to take home.

Since taking part in iFind, I have also been contacted to take part in another study, this time using magnetic resonance imaging (MRI) to look at fetal brain development in the Developing Human Connectome Project. The aim of this study is to map the baby’s brain development before and after birth, to better understand how the brain grows and how problems may arise.

This involved coming into St Thomas’s post-natal scanning department and spending about 60 minutes inside a big MRI machine. I have to admit, it was quite noisy and cramped in there, but the imaging team were fantastic and reassuring. I had music in my headphones, plenty of pillows and came out a couple of times for a quick break. After the scan, Laura from the team went through my images with me and showed me a video of my baby moving in my stomach and the different parts of her brain. They also sent me links to the images, so I can keep them.

A sagittal view of Jenny’s baby’s brain

In the future, these images and scans will contribute to a database of images that will help research. By understanding the potential benefits of using imaging to detect more problems before birth, the researchers hope to provide better information to parents and their doctors, and allow babies to get access to the treatments they need as soon as possible after they are born.

If you are interested in participating in the iFind project or any other fetal studies please contact: gst-tr.fetalbookings@nhs.net.
For further information on the iFIND study please contact: iFIND@gstt.nhs.uk

The Weighting Game


One of the most important things to get right when imaging an unborn baby is the fetal weight. How do we know the fetus is growing as it should be unless we know how big they are? How can we know that the heart, or brain, or lungs, are developing normally, unless we can compare them to the rest of the body?

Unfortunately, it isn’t as easy as just popping the baby on a set of scales. They are floating in water, attached to a placenta via a long umbilical cord, surrounded by a muscular womb, and – oh yeah! – their mother. So any time we guess how much the fetus weighs, it is just that: a guess.

The most commonly used formula to estimate fetal weight was developed in the 1980s by Dr Francis Hadlock, using ultrasound to measure the head, abdomen and thighs, and estimating the weight of the baby from that. This method can actually be pretty inaccurate – for example, we know that ultrasound can be a bit blurry, doesn’t define the edges of bones very well, and depends on finding exactly the right angles to measure, which might not always be possible. Amazingly though, we’ve not been able to find a better way since then; almost every routine scan in the UK still uses this method. As my fetal medicine colleague Jackie Matthew put it: “a lot of people think it’s just down to how good the sonographer is – but it’s really not that simple”. Now though, as part of the iFind project, we are working on new ways of estimating the fetal weight which we hope will be far more accurate.
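For the curious, Hadlock’s approach is essentially a regression formula. Here is a minimal Python sketch of one commonly cited variant (coefficients from Hadlock et al., 1985, combining head circumference HC, abdominal circumference AC and femur length FL, all in centimetres; the example measurements below are purely illustrative, not from an iFind participant):

```python
def hadlock_efw(hc_cm: float, ac_cm: float, fl_cm: float) -> float:
    """Estimated fetal weight in grams (Hadlock et al. 1985, HC/AC/FL model)."""
    log10_efw = (1.326
                 - 0.00326 * ac_cm * fl_cm
                 + 0.0107 * hc_cm
                 + 0.0438 * ac_cm
                 + 0.158 * fl_cm)
    return 10 ** log10_efw

# Illustrative ~20-week biometry: HC 17.5 cm, AC 15 cm, FL 3.3 cm
print(f"{hadlock_efw(17.5, 15.0, 3.3):.0f} g")  # roughly 340 g
```

Note that the whole estimate hangs on three hand-measured lengths – which is exactly why a blurry edge or a slightly wrong angle can throw the final weight off.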

Zhang & Davidson’s 3D rendering of a fetal MRI

When each of our iFind 2 volunteers attends for an extra ultrasound and MRI scan, we use these to build a three-dimensional “atlas” of the fetus, which will form the foundation for the technologies we develop to screen for fetal abnormalities. Being able to see the baby in “3D” like this is one of the jobs of my colleagues Tong Zhang and Alice Davidson – the latter of whom produced this beautiful rendering of a fetus from an MRI scan. But this image doesn’t just look amazing: it also means we could have a much more accurate way of guessing the baby’s weight than a few blurry ultrasound measurements. Knowing how much space the baby takes up in three dimensions – the fetal volume – means we could potentially estimate far more precisely whether the baby is growing normally.
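To see why volume is so useful, consider the simplest possible conversion: if we assume the fetus has a roughly uniform tissue density (a figure of about 1.03 g/mL is often quoted for soft tissue, used here purely as an illustration, not an iFind result), then weight follows directly from an MRI-derived volume:

```python
ASSUMED_DENSITY_G_PER_ML = 1.03  # illustrative soft-tissue density, not an iFind value

def weight_from_volume(volume_ml: float) -> float:
    """Naive fetal weight estimate (grams) from a segmented MRI volume (mL)."""
    return volume_ml * ASSUMED_DENSITY_G_PER_ML

print(f"{weight_from_volume(300.0):.0f} g")  # a 300 mL fetus comes out at ~309 g
```

Of course, whether a single density figure really holds across all nine months of pregnancy is far from obvious – which is part of what makes the research question interesting.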

So that’s a win, right? Go iFind! Well… not quite. Unfortunately it’s still not that simple – and that’s where Jackie comes in. Her research has some difficult questions to answer: exactly how inaccurate are these ultrasound techniques? How do we know? Is MRI really better? How do we prove it? And how does a fetal volume equate to a fetal weight? Is it the same through all nine months of pregnancy?

These are tough questions, but like everyone else at iFind, Jackie is determined to find answers. And when she does, it’s these types of new discoveries that should help iFind get closer to its ultimate goal: using new technologies to improve how we see and understand life before birth.


Dr David Lloyd is a Clinical Research Fellow at King’s College London, working as part of the iFIND project. The overall aim of the intelligent Fetal Imaging and Diagnosis project is to combine innovative technologies into a clinical ultrasound system that will lead to a radical change in the way fetal screening is performed.


Many robotic arms make light work



One of the goals of the iFIND project is to produce an antenatal screening system that uses multiple ultrasound probes at the same time. There are lots of potential advantages to this – for example, we could combine the images from two probes to see more of the baby at once, or provide a more detailed picture of one part of the baby. With iFIND, our hope is to have several separate 3D ultrasound probes working simultaneously, giving us the opportunity to see more of the baby, in more detail, than ever before.

The problem is, how do we control a number of ultrasound probes at the same time? I’ve yet to meet anyone who can scan with two probes at the same time, and several people trying to scan one patient sounds like a bit of a crowd! There is a solution though, and it’s something the team here at iFIND are working hard to develop: robotic arms.

Sounds pretty cool doesn’t it? Get robots to do the scans! But let’s stop and think about this for a minute. We need to make a robotic arm that can not just hold an ultrasound probe, but can twist, flex, rotate and extend, just like a human arm, to get all the views necessary to visualise the baby. Then we need to give it “eyes”: something to tell it not just what it is seeing now, but where and how to move to see other parts of the baby. It also needs to know exactly how hard to press, and we need to make sure it has thorough safety mechanisms built in. Perhaps it’s a tougher challenge than it sounds.

David with Jackie Matthew, sonographer & research training fellow, each holding a probe to simultaneously scan a phantom fetus

However, as I’ve learnt, no problem is insurmountable for the team at iFIND, and indeed our dedicated robotics group are designing an arm that can do just that. The first step is to record in detail what humans do when they perform a scan, and that’s exactly what we do with our participants. Each dedicated iFIND ultrasound scan we perform records not only the imaging data, but also the exact position, force and torque (twisting force) of the ultrasound probe throughout the scan.

The video below shows an example: on the left, we can see how the sonographer is moving the ultrasound probe across the abdomen of one of our volunteers; the colours under the probe show how much pressure they applied to the skin. The right panel shows the ultrasound images so we know exactly what they could see at the time.
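To give a flavour of what “position, force and torque” means in practice, each instant of the scan can be thought of as a record like the one below. This is a hypothetical sketch in Python – the field names are mine, and iFIND’s actual data format may well differ:

```python
from dataclasses import dataclass

@dataclass
class ProbeSample:
    """One time-stamped reading from a tracked ultrasound probe (illustrative)."""
    t: float                                        # seconds since the scan started
    position_mm: tuple[float, float, float]         # probe position in the room (mm)
    orientation: tuple[float, float, float, float]  # probe orientation as a quaternion
    force_n: tuple[float, float, float]             # contact force on the skin (N)
    torque_nm: tuple[float, float, float]           # twisting force (N·m)

# A full scan is then a long sequence of these samples, recorded alongside
# the ultrasound images so each frame can be matched to a probe pose:
sample = ProbeSample(t=12.4,
                     position_mm=(102.0, -38.5, 241.7),
                     orientation=(0.71, 0.0, 0.71, 0.0),
                     force_n=(0.2, 0.1, 4.8),
                     torque_nm=(0.01, -0.02, 0.0))
```

Streams like this, from hundreds of scans, are what would let a robotic arm learn how a human sonographer actually moves and presses.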


We hope to collect information from all 500 of our participants, and will use it to instruct the robotic arms how to perform the ultrasound scans automatically, just like a person would.

Another problem the team have to think about is far simpler, but perhaps just as important: aesthetics. The arms we design need to look and feel just as gentle and safe as we are designing them to be. So whilst we are collecting all the important data to help develop the technology, we are also learning from participants, asking how they would feel being scanned by a robotic arm rather than a person, and what we could do to make them more comfortable with the idea.

So: our goal is to produce a robotic arm that has the dexterity and sensitivity of a human being, knows how to perform a fetal ultrasound (well, actually several of them), and doesn’t look scary… And they also have to talk to each other.

Maybe we’ll leave that for another blog…

Read previous posts about the iFIND project written by David Lloyd.

Moving scenes



One of the most important goals of the iFIND project is to build an “atlas” of the fetus: a comprehensive map of fetal anatomy at around 20 weeks gestation (when routine antenatal scans are performed). This means getting the best quality images that we can, from as many women as we can – but as I’m learning, taking pictures of a 20 week fetus while they’re still in the womb really isn’t that easy.

For one thing, they’re very (very) small. The fetal heart, for example, with all of its tiny chambers and valves, is only about 15mm long: less than the size of a penny. Ultrasound technology – used in all routine antenatal scans in the UK – is actually fairly good at visualising these tiny structures. It uses very high frequency sound waves, which are reflected back (“echo”) from the structures inside the body to produce an image. In fetal ultrasound the images produced can be excellent; but unfortunately that’s not true for every patient. Ultrasound has to be able to “see” through the body to the parts of the baby we want to image, and that isn’t always easy. It will depend on the age of the baby, how they are lying in the womb, the size of the mother, and many other factors.
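To give a sense of scale, the “very high frequency” matters because the finest detail ultrasound can resolve is on the order of the wavelength. A back-of-the-envelope calculation, assuming the standard textbook speed of sound in soft tissue (about 1540 m/s) and an illustrative probe frequency of 5 MHz:

```python
c = 1540.0  # speed of sound in soft tissue, m/s (standard assumed value)
f = 5.0e6   # probe frequency, Hz (illustrative; fetal probes vary)

wavelength_mm = c / f * 1000.0  # wavelength in millimetres
print(f"wavelength ≈ {wavelength_mm:.2f} mm")  # about 0.31 mm
```

At around a third of a millimetre, that is just fine enough to pick out the chambers and valves of a 15mm heart – provided the sound can reach it cleanly.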

MRI, which uses a strong magnetic field and radio waves to produce images, isn’t so limited. It can see the structures inside the body regardless of whether there’s bone, muscle or fat in the way; and in some cases it can give us even more detailed images than ultrasound. Importantly, it is also one of the few imaging techniques that is safe to use in pregnancy. The problem? MRI isn’t great with small, moving targets – like we see in the fetus.

So that’s why we ask our iFIND volunteers to have both an ultrasound and an MRI scan. By combining the strengths of these two technologies, we hope to get the best of both worlds to produce the most accurate fetal atlas we can.

Of course though, even that isn’t quite so simple. Fetal movements – like twisting, rolling, stretching and kicking – are a particularly tricky problem, even when we use both technologies together.

Watch this MRI clip from one of our volunteers. Unfortunately there’s not much you can do when your patient decides to start breakdancing halfway through a scan! At least, you’d think there wasn’t… but amazingly even that may not be an insurmountable problem. In the last few months I’ve been involved with some of the work of Bernhard Kainz and his colleagues, who have devised clever algorithms to automatically correct for small fetal movements during MRI and produce usable images.

These techniques show a huge amount of potential, and are an example of how the iFIND project is helping to generate exciting new technologies on its way to the ultimate goal: to improve the way we see developing babies in the womb.
