Source: Wikipedia, the free encyclopedia (retrieved 2013/06/21 23:06:06 JST)
| Fluoroscopy (intervention) | |
|---|---|
| [Image: A modern fluoroscope] | |
| ICD-10-PCS | B?1 |
| MeSH | D005471 |
Fluoroscopy is an imaging technique that uses X-rays to obtain real-time moving images of the internal structures of a patient through the use of a fluoroscope. In its simplest form, a fluoroscope consists of an X-ray source and a fluorescent screen, between which the patient is placed. Modern fluoroscopes, however, couple the screen to an X-ray image intensifier and a CCD video camera, allowing the images to be recorded and displayed on a monitor.
The use of X-rays, a form of ionizing radiation, requires that the potential risks of a procedure be carefully balanced against its benefits to the patient. While physicians always try to use low dose rates during fluoroscopic procedures, the length of a typical procedure often results in a relatively high absorbed dose to the patient. Recent advances include digitization of the captured images and flat-panel detector systems, which reduce the radiation dose to the patient still further.
The beginning of fluoroscopy can be traced back to 8 November 1895, when Wilhelm Röntgen noticed a barium platinocyanide screen fluorescing as a result of being exposed to what he would later call X-rays. Within months of this discovery, the first fluoroscopes were created. Early fluoroscopes were simply cardboard funnels, open at the narrow end for the eyes of the observer, while the wide end was closed with a thin piece of cardboard coated on the inside with a layer of fluorescent metal salt. The fluoroscopic image obtained in this way was rather faint. Thomas Edison quickly discovered that calcium tungstate screens produced brighter images and is credited with designing and producing the first commercially available fluoroscope. In the technique's infancy, many incorrectly predicted that the moving images of fluoroscopy would completely replace still X-ray radiographs, but the superior diagnostic quality of still radiographs prevented this from occurring.
Ignorance of the harmful effects of X-rays meant that none of the standard radiation safety procedures employed today were in place. Scientists and physicians would often place their hands directly in the X-ray beam, resulting in radiation burns. Edison's assistant Clarence Madison Dally (1865–1904) died as a result of his exposure to radiation from fluoroscopes, and in 1903 Edison abandoned his work on fluoroscopes, saying, "Don't talk to me about X-rays, I am afraid of them."[1] Trivial uses of the technology also appeared, including the shoe-fitting fluoroscope used by shoe stores in the 1930s–1950s.[2]
Because of the limited light produced by the fluorescent screens, early radiologists were required to sit in the darkened room in which the procedure was performed, accustoming their eyes to the dark and thereby increasing their sensitivity to the light. Sitting behind the screen in this way exposed the radiologist to significant radiation doses. Red adaptation goggles were developed by Wilhelm Trendelenburg in 1916 to address the problem of dark adaptation of the eyes, previously studied by Antoine Béclère. Because the goggles transmitted only red light, they preserved the physician's dark adaptation before the procedure while still letting through enough light for him to function normally.
The development of the X-ray image intensifier by Westinghouse in the late 1940s,[3] in combination with closed-circuit TV cameras in the 1950s, revolutionized fluoroscopy. The red adaptation goggles became obsolete because image intensifiers amplified the light produced by the fluorescent screen, allowing the image to be seen even in a lighted room. The addition of the camera enabled viewing of the image on a monitor, allowing the radiologist to view the images in a separate room, away from the risk of radiation exposure.
More modern improvements in screen phosphors, image intensifiers and even flat-panel detectors have allowed for increased image quality while minimizing the radiation dose to the patient. Modern fluoroscopes use caesium iodide (CsI) screens and produce noise-limited images, ensuring that the lowest possible radiation dose is used while still obtaining images of acceptable quality.
Because fluoroscopy involves the use of X-rays, a form of ionizing radiation, all fluoroscopic procedures pose a potential risk of radiation-induced cancer to the patient. Radiation doses to the patient depend greatly on the size of the patient as well as the length of the procedure, with typical skin dose rates quoted as 20–50 mGy/min. Exposure times vary depending on the procedure being performed, but procedure times of up to 75 minutes have been documented. Because of the length of some procedures, deterministic radiation effects have also been observed, in addition to the cancer risk and other stochastic radiation effects, ranging from mild erythema, equivalent to a sunburn, to more serious burns.
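As a rough illustration of how the figures above combine, the sketch below (Python) multiplies the quoted skin dose rates by the longest documented procedure time; treating the dose rate as constant for the whole procedure is an idealizing assumption.

```python
# Rough estimate of cumulative skin dose from the figures quoted above.
# The dose-rate range (20-50 mGy/min) and the 75-minute procedure time come
# from the text; a constant dose rate is an idealizing assumption.

def cumulative_skin_dose(dose_rate_mgy_per_min: float, minutes: float) -> float:
    """Cumulative skin dose in mGy, assuming a constant dose rate."""
    return dose_rate_mgy_per_min * minutes

for rate in (20, 50):                                  # mGy/min
    dose_gy = cumulative_skin_dose(rate, 75) / 1000    # convert mGy to Gy
    print(f"{rate} mGy/min over 75 min -> {dose_gy:.2f} Gy to the skin")
# 20 mGy/min gives 1.50 Gy and 50 mGy/min gives 3.75 Gy, which is why
# deterministic skin effects can appear after very long procedures.
```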
In 1994 the Food and Drug Administration (FDA) carried out a study of radiation-induced skin injuries,[4][5] followed by an advisory to minimize further fluoroscopy-induced injuries.[6] The problem of radiation injuries due to fluoroscopy was further addressed in review articles in 2000[7] and 2010.[8]
While deterministic radiation effects are a possibility, radiation burns are not typical of standard fluoroscopic procedures. Most procedures sufficiently long in duration to produce radiation burns are part of necessary life-saving operations.
X-ray image intensifiers generally incorporate dose-reducing features such as pulsed rather than continuous radiation, and last image hold, which "freezes" the screen and makes it available for examination without exposing the patient to unnecessary radiation.[9]
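As a sketch of why pulsed operation lowers dose, the example below assumes the dose rate scales in proportion to the pulse rate; the 30 pulses-per-second reference and the strict proportionality are illustrative assumptions, since real systems also adjust the dose delivered per pulse.

```python
# Idealized model of pulsed fluoroscopy: if each pulse delivers a similar
# dose, lowering the pulse rate lowers the dose rate proportionally.
# The proportionality and the 30 pulses/s reference are assumptions.

def relative_dose(pulse_rate_fps: float, reference_fps: float = 30.0) -> float:
    """Fraction of the reference dose rate delivered at a lower pulse rate."""
    return pulse_rate_fps / reference_fps

for fps in (30, 15, 7.5):
    print(f"{fps:>4} pulses/s -> {relative_dose(fps):.0%} of the reference dose rate")
# 15 pulses/s -> 50%, 7.5 pulses/s -> 25% under this first-order model.
```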
The first fluoroscopes consisted of an X-ray source and a fluorescent screen between which the patient was placed. As X-rays pass through the patient, they are attenuated by varying amounts as they interact with the different internal structures of the body, casting a shadow of those structures on the fluorescent screen. Images on the screen are produced as the unattenuated X-rays interact with atoms in the screen through the photoelectric effect, giving their energy to electrons. While much of the energy given to the electrons is dissipated as heat, a fraction is given off as visible light, producing the image. Early radiologists would adapt their eyes to view the dim fluoroscopic images by sitting in darkened rooms or by wearing red adaptation goggles.
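The differential attenuation described above follows the Beer–Lambert law, I = I₀·exp(−μx). The sketch below uses assumed round-number attenuation coefficients for soft tissue and bone (not values from this article) to show why denser structures cast darker shadows on the fluorescent screen.

```python
import math

# Beer-Lambert attenuation: I = I0 * exp(-mu * x).
# The coefficients below are illustrative round numbers for the diagnostic
# energy range, not measured values.

MU_PER_CM = {"soft tissue": 0.2, "bone": 0.5}   # linear attenuation coefficient (assumed)

def transmitted_fraction(material: str, thickness_cm: float) -> float:
    """Fraction of incident X-ray intensity reaching the screen."""
    return math.exp(-MU_PER_CM[material] * thickness_cm)

for material in MU_PER_CM:
    frac = transmitted_fraction(material, thickness_cm=5.0)
    print(f"5 cm of {material}: {frac:.1%} of the beam transmitted")
# Bone attenuates far more of the beam than soft tissue, so it appears as a
# darker (less excited) region of the fluorescent screen.
```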
The invention of X-ray image intensifiers in the 1950s allowed the image on the screen to be visible under normal lighting conditions, as well as providing the option of recording the images with a conventional camera. Subsequent improvements included the coupling of, at first, video cameras and, later, CCD cameras to permit recording of moving images and electronic storage of still images.
Modern image intensifiers no longer use a separate fluorescent screen. Instead, a caesium iodide phosphor is deposited directly on the photocathode of the intensifier tube. On a typical general-purpose system, the output image is approximately 10⁵ times brighter than the input image. This brightness gain comprises a flux gain (amplification of photon number) and a minification gain (concentration of photons from a large input screen onto a small output screen), each of approximately 100. This level of gain is sufficient that quantum noise, due to the limited number of X-ray photons, is a significant factor limiting image quality.
Image intensifiers are available with input diameters of up to 45 cm and a resolution of approximately 2–3 line pairs mm⁻¹.
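A minimal sketch of the gain arithmetic described above: the ~100 flux gain is the figure quoted in the text, while the screen diameters are hypothetical round numbers chosen only to give a 10:1 diameter ratio.

```python
# Brightness gain of an image intensifier = flux gain x minification gain.
# The flux gain of ~100 is quoted above; the 25 cm input and 2.5 cm output
# screen diameters are hypothetical.

def minification_gain(d_input_cm: float, d_output_cm: float) -> float:
    """Ratio of input to output phosphor areas."""
    return (d_input_cm / d_output_cm) ** 2

flux_gain = 100.0
min_gain = minification_gain(25.0, 2.5)            # ~100 for a 10:1 diameter ratio
print(f"minification gain ~ {min_gain:.0f}")
print(f"total brightness gain ~ {flux_gain * min_gain:.0e}")
# With these round numbers the product is of order 10^4; actual gains vary
# from tube to tube, and it is this combined gain that makes the image
# bright enough to view under normal room lighting.
```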
The introduction of flat-panel detectors has allowed the image intensifier to be replaced in fluoroscope designs. Flat-panel detectors offer increased sensitivity to X-rays and therefore have the potential to reduce the patient's radiation dose. Temporal resolution is also improved over image intensifiers, reducing motion blurring. The contrast ratio is improved as well: flat-panel detectors are linear over a very wide latitude, whereas image intensifiers have a maximum contrast ratio of about 35:1. Spatial resolution is approximately equal, although an image intensifier operating in 'magnification' mode may be slightly better than a flat panel.
Flat panel detectors are considerably more expensive to purchase and repair than image intensifiers, so their uptake is primarily in specialties that require high-speed imaging, e.g., vascular imaging and cardiac catheterization.
A number of substances have been used as positive contrast agents, including silver, bismuth, caesium, thorium, tin, zirconium, tantalum, tungsten and lanthanide compounds. The use of thoria (thorium dioxide) as an agent was rapidly abandoned because thorium causes liver cancer. Most modern injected radiographic positive contrast media are iodine-based. Iodinated contrast comes in two forms: ionic and non-ionic compounds. Non-ionic contrast is significantly more expensive than ionic (approximately three to five times the cost); however, it tends to be safer for the patient, causing fewer allergic reactions and fewer uncomfortable side effects such as hot sensations or flushing. Most imaging centers now use non-ionic contrast exclusively, finding that the benefits to patients outweigh the expense.
The negative radiographic contrast agents are air and carbon dioxide (CO2). Carbon dioxide is easily absorbed by the body and causes less spasm; it can also be injected into the blood vessels, whereas air cannot.
In addition to the spatial blurring factors that affect all X-ray imaging devices, caused by such things as the Lubberts effect, K-fluorescence reabsorption and electron range, fluoroscopic systems also experience temporal blurring due to system lag. This temporal blurring has the effect of averaging frames together. While this helps reduce noise in images of stationary objects, it creates motion blurring for moving objects. Temporal blurring also complicates measurements of system performance for fluoroscopic systems.
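One simple way to picture the lag described above is as a first-order recursive average of incoming frames. The lag fraction in the sketch below is hypothetical, and the model is an illustration of the trade-off rather than a description of any particular detector.

```python
import numpy as np

# First-order lag model: each displayed frame is a weighted mix of the new
# frame and the previously displayed one. The lag fraction is hypothetical.

def apply_lag(frames: np.ndarray, lag: float = 0.7) -> np.ndarray:
    """Blend each frame with the previous displayed frame (simple lag model)."""
    displayed = np.empty_like(frames, dtype=float)
    displayed[0] = frames[0]
    for i in range(1, len(frames)):
        displayed[i] = (1 - lag) * frames[i] + lag * displayed[i - 1]
    return displayed

# A bright object that jumps from pixel 2 to pixel 7 between two frames:
frames = np.zeros((2, 10))
frames[0, 2] = 1.0
frames[1, 7] = 1.0
print(apply_lag(frames)[1].round(2))
# The displayed second frame still shows residual signal at pixel 2 as well as
# pixel 7: the averaging that suppresses quantum noise also blurs motion.
```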
Fluoroscopy can be used to examine the digestive system using a substance that is opaque to X-rays (usually barium sulfate or Gastrografin), which is introduced into the digestive system either by swallowing or as an enema. This is normally done as part of a double-contrast technique, using positive and negative contrast. Barium sulfate coats the walls of the digestive tract (positive contrast), which allows the shape of the digestive tract to be outlined as white or clear on an X-ray. Air may then be introduced (negative contrast), which looks black on the film. The barium meal is an example of a contrast agent swallowed to examine the upper digestive tract. Note that while soluble barium compounds are very toxic, the insoluble barium sulfate is non-toxic because its low solubility prevents the body from absorbing it.
Another common procedure is the modified barium swallow study, during which barium-impregnated liquids and solids are ingested by the patient. A radiologist records and, together with a speech pathologist, interprets the resulting images to diagnose oral and pharyngeal swallowing dysfunction. Modified barium swallow studies are also used in studying normal swallow function.
In the medical vernacular, "cine" refers to cineradiography, which records fluoroscopic images of internal organs, such as the heart, at 30 frames per second during the injection of contrast dye, to better visualize regions of stenosis or to record motility in the gastrointestinal tract. The technology is being replaced with digital imaging systems that decrease the frame rate but also decrease the absorbed radiation dose to the patient.[citation needed]