Sep 13, 2013
PET in motion: imaging awake subjects
PET is an essential tool for non-invasive imaging of the human brain and animal models of human brain disorders, providing unique information regarding brain function. If the subject moves during scanning, however, the resulting image may be blurred and its resolution degraded. Moreover, motion during dynamic PET studies, which track tracer uptake in brain tissue with time, can lead to image artefacts and biased parameter estimates.
To combat these deleterious effects, anaesthesia is generally employed to prevent motion – and eliminate stress – during small-animal imaging. Likewise, in PET studies of children and some adults, sedation and occasionally anaesthesia are used. Unfortunately, the anaesthetic itself interferes with physiological processes in the brain, as well as limiting the types of studies that can be performed.
Speaking at the 20th International Conference on Medical Physics, held last week in Brighton, UK, Steven Meikle explained how performing PET on conscious, freely moving subjects enables brain function to be studied during learning tasks and response to external stimuli. He also described some motion mitigation techniques that enable such "awake" PET scans.
"Motion can have serious effects, both on qualitative images and the quantitative parameters that we extract from these images," said Meikle, professor of medical imaging physics at the University of Sydney's Brain and Mind Research Institute in Australia. "There is an important need to solve the problem of motion correction, in both humans and animals. And we want to translate these techniques in both directions."
Meikle began by describing the use of tracking markers during PET scans. For human subjects, markers attached to the head have been employed to continuously monitor movements in 3D space via reflection of visible or infrared light, with better than 1 mm positional accuracy. Following the scan, PET data and motion data are synchronised and a motion-corrected image is reconstructed.
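The core of this correction step is geometric: the tracker reports a rigid-body pose of the head at each instant, and each detected event is mapped back into a common reference frame by applying the inverse of the pose recorded at its timestamp. The sketch below illustrates the idea in 2D with numpy; real head tracking uses full 3D poses (six degrees of freedom), and the function names here are illustrative, not the authors' actual reconstruction code.

```python
import numpy as np

def rigid_transform(angle_deg, translation):
    """Build a 2D rigid-body transform (rotation + translation) as a
    3x3 homogeneous matrix. Illustrative only: head tracking in PET is
    3D, but 2D keeps the sketch short."""
    a = np.deg2rad(angle_deg)
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, -s, translation[0]],
                     [s,  c, translation[1]],
                     [0.0, 0.0, 1.0]])

def correct_events(events, poses):
    """Map each detected event back into the reference frame by applying
    the inverse of the head pose recorded at its timestamp.
    `events` is an (N, 2) array of coordinates; `poses` is a list of
    3x3 homogeneous transforms, one per event."""
    corrected = []
    for point, pose in zip(events, poses):
        homogeneous = np.array([point[0], point[1], 1.0])
        corrected.append(np.linalg.inv(pose) @ homogeneous)
    return np.array(corrected)[:, :2]

# A point that moved with the head returns to its reference position.
T = rigid_transform(10.0, (2.0, -1.0))
moved = (T @ np.array([5.0, 3.0, 1.0]))[:2]
fixed = correct_events(np.array([moved]), [T])
```

In practice the correction is applied to each event's line of response (or to short image frames) rather than to single points, but the inverse-transform principle is the same.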
So can this strategy be transferred to small-animal imaging? Meikle's colleague Roger Fulton led the effort to answer this question. Their PhD student (now postdoc) Andre Kyme examined the motion of rats and mice and came up with a set of requirements for motion tracking of animals. Specifications include a positional accuracy of less than 0.5 mm and a sampling rate of at least 20 Hz.
These requirements are met by the MicronTracker stereo-optical motion tracking device (from Claron Technology), which tracks the position and orientation of three markers at up to 48 Hz, with 0.25 mm accuracy. The researchers used this system to record the motion of awake rats confined within a tube but free to move their heads. A small reflective marker was attached to the rat's forehead and two larger reference markers fixed to the gantry.
They performed PET measurements on the tube-bound rats with a microPET small-animal scanner and used a modified image reconstruction algorithm to account for the recorded motion. Comparisons of uncorrected and motion-corrected PET images revealed a marked qualitative improvement after correction. "We can obtain high-definition reconstructions using this motion correction approach," said Meikle.
He cited an example of a predictive learning experiment that could not be performed without motion correction. In this study, rats were trained to associate a particular tone with a reward of sugar water. PET scans were then recorded while the animals were played this tone and other sounds. The researchers saw distinct changes in FDG uptake in the thalamus and dorsal prefrontal cortex when the animals anticipated their sweet reward.
The next step is to extend this approach to imaging of freely moving animals. To achieve this, Meikle and colleagues built a robotic tracking system based on a microPET. Here, the animal is free to move in any direction within an enclosure. The animal's position is monitored using an optical motion tracking system, and a robotic arm controls the position of the chamber to keep the animal's head within the scanner's field-of-view.
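Conceptually, this is a closed feedback loop: the tracker measures how far the head has drifted from the centre of the field-of-view, and the robot moves the chamber to cancel that offset. The talk did not describe the actual control law, so the sketch below assumes a simple proportional controller purely for illustration.

```python
import numpy as np

def robot_step(head_pos, chamber_pos, fov_centre, gain=0.5):
    """One control step: move the chamber by a fraction (`gain`) of the
    head's offset from the field-of-view centre. A plain proportional
    controller, assumed here for illustration only."""
    error = fov_centre - head_pos
    return chamber_pos + gain * error

# Simulate: the head starts 10 mm off-centre in x; repeated control
# steps drive it back toward the centre of the field-of-view.
head = np.array([10.0, -4.0])
chamber = np.array([0.0, 0.0])
centre = np.array([0.0, 0.0])
for _ in range(20):
    new_chamber = robot_step(head, chamber, centre)
    head = head + (new_chamber - chamber)  # head moves with the chamber
    chamber = new_chamber
```

With a gain below 1 the offset shrinks geometrically at each step; a real system would also have to contend with tracker latency, robot dynamics and the animal's ongoing motion.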
The team performed 20-minute scans on freely moving rats. PET images reconstructed with motion compensation compared well to corresponding images from an anaesthetised animal. Meikle noted that this technique is still under development.
While marker-based tracking provides excellent motion compensation for PET studies, the need to attach the marker to the subject is a significant limitation in both animal and human imaging. Instead, Kyme and colleagues are examining the possibility of less intrusive, markerless motion tracking. As well as considerably simplifying awake animal experiments, markerless tracking may improve the accuracy of motion measurements and extend the range of detectable motion. It also avoids the risk of markers being detached by the animals.
Markerless tracking works by using optical cameras to image the subject's motion and a scale-invariant feature transform (SIFT) algorithm to detect a large number of features in these images. The algorithm matches corresponding features within different images and determines motion data, which are then used to correct the PET images.
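Once corresponding features have been matched between camera frames, the remaining step is to estimate the rigid motion that maps one set of points onto the other. A standard least-squares solution is the Kabsch algorithm, sketched below with numpy; the SIFT detection and matching stage is replaced here by synthetic point sets, and this is an illustration of the underlying geometry rather than the group's actual pipeline.

```python
import numpy as np

def estimate_rigid(src, dst):
    """Least-squares rigid transform (Kabsch algorithm) mapping matched
    points `src` onto `dst`: returns R, t such that dst ~ src @ R.T + t.
    In a markerless tracker, `src` and `dst` would be SIFT features
    matched between two camera frames."""
    src_c = src - src.mean(axis=0)
    dst_c = dst - dst.mean(axis=0)
    H = src_c.T @ dst_c                    # cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:               # guard against reflections
        Vt[-1] *= -1
        R = Vt.T @ U.T
    t = dst.mean(axis=0) - R @ src.mean(axis=0)
    return R, t

# Recover a known 15-degree rotation and translation from synthetic
# "matched features".
rng = np.random.default_rng(0)
pts = rng.normal(size=(50, 2))
a = np.deg2rad(15.0)
R_true = np.array([[np.cos(a), -np.sin(a)],
                   [np.sin(a),  np.cos(a)]])
moved = pts @ R_true.T + np.array([1.0, 2.0])
R_est, t_est = estimate_rigid(pts, moved)
```

Because many feature points contribute to each pose estimate, individual mismatches average out, which is one reason a dense markerless approach can rival or exceed the accuracy of a single attached marker.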
In a validation experiment, the researchers compared markerless and marker-based motion tracking of a rat's head. They found that the markerless approach provided a more accurate description of the motion, likely because it uses more tracking points. In a study on an awake rat, they saw good agreement between motion-corrected PET images and reference images of the anaesthetised animal.
The team is in the process of translating these methods to the clinic. A study on a human volunteer showed good agreement between markerless and marker-based motion tracking. "This gives us encouragement that the technique should work in clinical situations," said Meikle. He concluded that markerless motion tracking promises to simplify animal studies and is suitable for tracking human head motion, adding that a pilot study of paediatric PET patients is now underway.
About the author
Tami Freeman is editor of medicalphysicsweb.