Taking part in a research study: a different perspective

Dr Jenny Cook

Dr Jenny Cook is a Research Associate at King’s College London studying the impact of engaging publics with health research.

I am a researcher in Public Engagement at King’s College London and the National Institute for Health Research Biomedical Research Centre. Part of my role is to promote taking part in clinical research to the general population.

So last month I decided to practise what I preach and took part in a research study at the Division of Imaging Sciences and Biomedical Engineering, based at St Thomas’ Hospital. The study is called the iFIND project, which stands for intelligent Fetal Imaging and Diagnosis.

The study, funded jointly by the Wellcome Trust and EPSRC, aims to improve the accuracy of the routine 18–20 week screening scan in pregnancy by bringing together advanced ultrasound and magnetic resonance imaging (MRI) techniques, robotics and computer-aided diagnostics.

So when I realised I was eligible to take part, I thought a couple of extra hospital visits would be worth it for a good cause!

I had previously worked with the iFIND team to engage different audiences with the project, running educational and interactive sessions using a pregnant tummy mannequin and the ultrasound machines for the King’s Health Partners Summer School and International Clinical Trials Day.

I emailed them for more details and was put in touch with a very friendly Research Midwife who sent me over a patient information pack and some available dates.

I arrived at the Clinical Research Facility on a grey drizzly morning and was met by Josie, a friendly researcher who ran through some final consent and information forms with me. She reassured me about the practical details of the study, like who would be present and how long each part would last.

I went into the ultrasound room and was met by three people: a research sonographer, a fetal cardiac clinician researcher, and a researcher working on the imaging robotics part of the project.

They talked me through the images they were collecting and explained what they meant and why they were important to the study. They also showed me how they can create 3D images using the new software, and at the end printed out five pictures for me to take home.

Since taking part in iFIND, I have also been contacted to take part in another study, this time using MRI to look at fetal brain development in the Developing Human Connectome Project. The aim of this study is to map the baby’s brain development before and after birth, to better understand how the brain grows and how problems may arise.

This involved coming into St Thomas’s post-natal scanning department and spending about 60 minutes inside a big MRI machine. I have to admit, it was quite noisy and cramped in there, but the imaging team were fantastic and reassuring. I had music in my headphones, plenty of pillows and came out a couple of times for a quick break. After the scan, Laura from the team went through my images with me and showed me a video of my baby moving in my stomach and the different parts of her brain. They also sent me links to the images, so I can keep them.

A sagittal view of Jenny’s baby’s brain

In the future, these images and scans will contribute to a database of images that will help research. By understanding the potential benefits of using imaging and detecting more problems before birth, the researchers hope to provide better information to parents and their doctors, and to allow babies to access the treatments they need as soon as possible after they are born.


If you are interested in participating in the iFIND project or any other fetal studies, please contact: gst-tr.fetalbookings@nhs.net.
For further information on the iFIND study please contact: iFIND@gstt.nhs.uk


Many robotic arms make light work

Dr David Lloyd

Dr David Lloyd is a Clinical Research Fellow at King’s College London, working as part of the iFIND project. The overall aim of the intelligent Fetal Imaging and Diagnosis project is to combine innovative technologies into a clinical ultrasound system that will lead to a radical change in the way fetal screening is performed.

One of the goals of the iFIND project is to produce an antenatal screening system that uses multiple ultrasound probes at the same time. There are lots of potential advantages to this – for example, we could combine the images from two probes to see more of the baby at once, or provide a more detailed picture of one part of the baby. With iFIND, our hope is to have several separate 3D ultrasound probes working simultaneously, giving us the opportunity to see more of the baby, in more detail, than ever before.

The problem is, how do we control a number of ultrasound probes at the same time? I’ve yet to meet anyone who can scan with two probes at the same time, and several people trying to scan one patient sounds like a bit of a crowd! There is a solution though, and it’s something the team here at iFIND are working hard to develop: robotic arms.

Sounds pretty cool doesn’t it? Get robots to do the scans! But let’s stop and think about this for a minute. We need to make a robotic arm that can not just hold an ultrasound probe, but can twist, flex, rotate and extend, just like a human arm, to get all the views necessary to visualise the baby. Then we need to give it “eyes”: something to tell it not just what it is seeing now, but where and how to move to see other parts of the baby. It also needs to know exactly how hard to press, and we need to make sure it has thorough safety mechanisms built in. Perhaps it’s a tougher challenge than it sounds.
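To make one of those requirements concrete, here is a toy sketch (my own illustration, not iFIND’s actual control code) of the kind of hard safety limit a scanning arm would need on how firmly the probe can press:

```python
# Toy sketch of a hard safety limit on probe contact force.
# MAX_FORCE_N is a hypothetical ceiling, not a value from the iFIND project.
MAX_FORCE_N = 10.0  # maximum comfortable probe force, in newtons (assumed)

def safe_force_command(desired_n: float, measured_n: float) -> float:
    """Clamp the force the arm is asked to apply; back off entirely if the
    force sensor already reads at or above the safety ceiling."""
    if measured_n >= MAX_FORCE_N:
        return 0.0  # retreat: never push harder once the limit is reached
    return min(desired_n, MAX_FORCE_N)

# Example: the controller wants 12 N, but the limit caps it at 10 N.
print(safe_force_command(12.0, 4.0))  # -> 10.0
```

In a real system this would be only one layer of many, but it captures the principle: the hardware must be incapable of pressing harder than a person would.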

David with Jackie Matthew, sonographer and research training fellow, each holding a probe to simultaneously scan a phantom fetus

However, as I’ve learnt, no problem is insurmountable for the team at iFIND, and indeed our dedicated robotics group are designing an arm that can do just that. The first step is to record in detail what humans do when they perform a scan, and that’s exactly what we do with our participants. Each dedicated iFIND ultrasound scan we perform records not only the imaging data, but also the exact position, force and torque (twisting force) of the ultrasound probe throughout the scan.
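As a rough illustration of what gets recorded (a hypothetical sketch of my own, not the team’s actual data format), each instant of a scan could be captured as a record like this:

```python
# Hypothetical sketch of one time-point from an iFIND scan recording:
# where the probe was, how it was oriented, and how hard it was pressing.
from dataclasses import dataclass
from typing import Tuple

@dataclass
class ProbeSample:
    t: float                                         # seconds since scan start
    position_mm: Tuple[float, float, float]          # probe position in space
    orientation: Tuple[float, float, float, float]   # quaternion (w, x, y, z)
    force_n: Tuple[float, float, float]              # contact force on the skin
    torque_nm: Tuple[float, float, float]            # twisting force, each axis
    frame_index: int                                 # ultrasound frame on screen

# A whole scan is then a time series of these samples, which can later be
# replayed to teach a robotic arm what the human sonographer actually did.
scan: list = []
scan.append(ProbeSample(0.0, (102.5, 44.1, 30.2), (1.0, 0.0, 0.0, 0.0),
                        (0.0, 0.0, 3.2), (0.0, 0.1, 0.0), 0))
```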

The video below shows an example: on the left, we can see how the sonographer is moving the ultrasound probe across the abdomen of one of our volunteers; the colours under the probe show how much pressure is being applied to the skin. The right panel shows the ultrasound images, so we know exactly what the sonographer could see at the time.

Video: the probe’s path and pressure map (left) alongside the live ultrasound images (right)

We hope to collect information from all 500 of our participants, and will use it to instruct the robotic arms how to perform the ultrasound scans automatically, just like a person would.

Another problem the team have to think about is far simpler, but perhaps just as important: aesthetics. The arms we design need to look and feel just as gentle and safe as we are designing them to be. So whilst we are collecting all the important data to help develop the technology, we are also learning from participants: asking how they would feel being scanned by a robotic arm rather than by a person, and what we could do to make them more comfortable with the idea.

So: our goal is to produce a robotic arm that has the dexterity and sensitivity of a human being, knows how to perform a fetal ultrasound (well, actually several of them), and doesn’t look scary. Oh, and they also have to talk to each other.

Maybe we’ll leave that for another blog…

Read previous posts about the iFIND project written by David Lloyd.

Moving scenes

Dr David Lloyd is a Clinical Research Fellow at King’s College London, working as part of the iFIND project. The overall aim of the intelligent Fetal Imaging and Diagnosis project is to combine innovative technologies into a clinical ultrasound system that will lead to a radical change in the way fetal screening is performed.

Dr David Lloyd

One of the most important goals of the iFIND project is to build an “atlas” of the fetus: a comprehensive map of fetal anatomy at around 20 weeks gestation (when routine antenatal scans are performed). This means getting the best quality images that we can, from as many women as we can – but as I’m learning, taking pictures of a 20-week fetus while they’re still in the womb really isn’t that easy.

For one thing, they’re very (very) small. The fetal heart, for example, with all of its tiny chambers and valves, is only about 15mm long: less than the size of a penny. Ultrasound technology – used in all routine antenatal scans in the UK – is actually fairly good at visualising these tiny structures. It uses very high frequency sound waves which are reflected back (“echo”) from the structures inside the body to produce an image. In fetal ultrasound, the images produced can be excellent; but unfortunately that’s not true for every patient. Ultrasound has to be able to “see” through the body to the parts of the baby we want to image, and that isn’t always easy. It will depend on the age of the baby, how they are lying in the womb, the size of the mother, and many other factors.
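To give a feel for how that works (a back-of-envelope sketch of the general principle, not any particular scanner’s software): sound travels through soft tissue at roughly 1540 metres per second, and each echo makes a round trip, so the scanner can convert echo timing directly into depth:

```python
# Back-of-envelope: ultrasound depth from echo timing.
# Sound travels at roughly 1540 m/s in soft tissue; an echo travels there
# and back, so depth = speed * time / 2.
SPEED_OF_SOUND_M_S = 1540.0  # typical average for soft tissue

def echo_depth_mm(echo_time_us: float) -> float:
    """Depth in mm of a structure whose echo returns after echo_time_us microseconds."""
    return SPEED_OF_SOUND_M_S * (echo_time_us * 1e-6) / 2.0 * 1000.0

# An echo returning after ~130 microseconds comes from about 10 cm deep,
# roughly the distance from the mother's skin to the fetus.
print(round(echo_depth_mm(130.0), 1))  # -> 100.1
```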

MRI, which uses a strong magnetic field and radio waves to produce images, isn’t so limited. It can see the structures inside the body regardless of whether there’s bone, muscle or fat in the way; and in some cases it can give us even more detailed images than ultrasound. Importantly, it is also one of the few imaging techniques that is safe to use in pregnancy. The problem? MRI isn’t great with small, moving targets – like those we see in the fetus.

So that’s why we ask our iFIND volunteers to have both an ultrasound and an MRI scan. By combining the strengths of these two technologies, we hope to get the best of both worlds to produce the most accurate fetal atlas we can.

Of course though, even that isn’t quite so simple. Fetal movements – like twisting, rolling, stretching and kicking – are a particularly tricky problem, even when we use both technologies together.

Watch this MRI clip from one of our volunteers. Unfortunately there’s not much you can do when your patient decides to start breakdancing halfway through a scan! At least, you’d think there wasn’t… but amazingly even that may not be an insurmountable problem. In the last few months I’ve been involved with some of the work of Bernhard Kainz and his colleagues, who have devised clever algorithms to automatically correct for small fetal movements during MRI and produce usable images.
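For the technically curious, the basic building block of these methods is image registration: working out how each 2D slice has moved, and putting it back before the 3D volume is reconstructed. The toy Python sketch below (my own illustration, very much simplified from the real pipeline, which handles full 3D rigid motion slice-by-slice) uses phase correlation to recover and undo a known translation of a single synthetic slice:

```python
# Toy registration sketch: recover how a 2D image slice has translated,
# then undo that motion. This shows only the simplest 2D building block
# of motion correction, not the full fetal MRI reconstruction pipeline.
import numpy as np
from scipy import ndimage

def estimate_translation(ref, moved):
    """Phase correlation: if `moved` is `ref` shifted by d, the peak of the
    inverse FFT of the normalised cross-power spectrum lies at d (mod N)."""
    spectrum = np.conj(np.fft.fft2(ref)) * np.fft.fft2(moved)
    corr = np.fft.ifft2(spectrum / (np.abs(spectrum) + 1e-12)).real
    peak = np.array(np.unravel_index(np.argmax(corr), corr.shape))
    dims = np.array(corr.shape)
    return np.where(peak > dims // 2, peak - dims, peak)  # wrap to signed shifts

# Synthetic "anatomy": a bright ellipse on a 128 x 128 slice.
y, x = np.mgrid[:128, :128]
ref = ((((y - 64) / 30.0) ** 2 + ((x - 64) / 20.0) ** 2) < 1).astype(float)

moved = ndimage.shift(ref, (9, -5), order=1)   # the "fetus" kicks mid-scan
d = estimate_translation(ref, moved)           # -> [ 9 -5]
corrected = ndimage.shift(moved, -d, order=1)  # motion undone

print(d, np.abs(corrected - ref).max() < 1e-3)  # [ 9 -5] True
```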

These techniques show a huge amount of potential, and are an example of how the iFIND project is helping to generate exciting new technologies on its way to the ultimate goal: to improve the way we see developing babies in the womb.

Read previous posts about the iFIND project written by David Lloyd.

New technology to improve 20 week ultrasound scan

Dr David Lloyd is a Clinical Research Fellow at King’s College London and has recently joined a multi-disciplinary team working together on an exciting new project called iFIND.

Dr David Lloyd

The last two months have been some of the most interesting and exciting of my career: I left my clinical post as a doctor in paediatric cardiology to start a PhD and join an ambitious project called iFIND, based at King’s and funded by the Wellcome Trust and the Engineering and Physical Sciences Research Council. The aim of the project is to produce a fully automated system to replace the 20-week ultrasound scans that are currently routine for all pregnant women in the UK. These scans are performed by experienced, highly trained sonographers, but unfortunately detecting every problem is just not possible – in fact only around half of all congenital abnormalities are picked up in this way.

3D ultrasound

So the project is a revolutionary one that could make a huge difference – but then, when you really start to think about it, replacing humans in this process is a pretty daunting task. Can we really make machines that are so dexterous and sensitive that they can perform ultrasound scans on pregnant mothers? What do humans really “see” when we are looking at ultrasound images anyway? Our brains are incredibly adept at recognising and interpreting visual patterns, in ways we don’t yet fully understand – can we really teach a computer to see in the same way as a person? Can we then teach them to recognise what is normal and not normal? Unsurprisingly, the team at iFIND think the answer to all those questions is – or will be – a resounding yes, but there’s still a huge amount of hard work to do to make it all a reality.

And the truth is that there’s very little you’d put beyond the reach of the iFIND team. My last three years working as a doctor looking after children in hospital with congenital heart disease have been a world of non-stop pagers, busy hospital wards and outpatient clinics, so the first few weeks here were a bit of a culture shock: a whirlwind not just of new faces, but also of new ideas and new technologies. One day I’m watching over the shoulder of my colleague Alberto as he puts on his 3D glasses and fires up his new holographic display; the next I’m staring blankly as Josh bamboozles me with a discourse on proton spins and the complex physics of MRI, or seeing James and the robotics team demonstrate the motion and pressure sensors that collect the data that will eventually inform the robotic arms performing the ultrasound scans. It’s continually inspiring to be working amongst such a motivated, passionate and intelligent group of people.

MRI scan of a fetus

My particular clinical interest is in one of the fastest-growing fields within cardiology – fetal cardiology – so the project suits me well. But what difference does it actually make, picking up a heart, limb, lung or brain abnormality in a baby that is not yet born? Actually, it can be extremely beneficial: it gives us the opportunity to spend time with parents and explain exactly what the diagnosis means for their baby; it means we can plan ahead and put everything in place to keep them healthy once they are born; and ultimately it means we can give them the support they need to grow and develop as early as possible. In rare circumstances we can even perform keyhole-type interventional procedures on the fetus if needed.

So my new role has been challenging, eye-opening, and ultimately inspiring. Being part of such a diverse and dynamic team working on such an important project is a real privilege; not just the sense that we are all working together towards a common goal, but also watching the new technologies that are being created along the way, bringing real benefits to real patients. And perhaps that’s the most inspiring thing of all – seeing the faces of prospective parents as they watch images of their unborn baby for the first time, so generously giving their time to help us make a difference for the families of the future.

This work was supported by the Wellcome Trust and EPSRC, Innovative Engineering for Health Award [102431].

The author acknowledges financial support from the Department of Health via the National Institute for Health Research comprehensive Biomedical Research Centre award to Guy’s & St Thomas’ NHS Foundation Trust in partnership with King’s College London and King’s College Hospital NHS Foundation Trust.