Source: the free encyclopedia Wikipedia, retrieved 2014/11/16 23:41:50 (JST)
Fluoroscopy | |
---|---|
Intervention | [Image: A modern fluoroscope] |
ICD-10-PCS | B?1 |
MeSH | D005471 |
Fluoroscopy is an imaging technique that uses X-rays to obtain real-time moving images of the internal structures of a patient through the use of a fluoroscope. In its simplest form, a fluoroscope consists of an X-ray source and a fluorescent screen, between which the patient is placed. Modern fluoroscopes, however, couple the screen to an X-ray image intensifier and a CCD video camera, allowing the images to be recorded and displayed on a monitor.
The use of X-rays, a form of ionizing radiation, requires that the potential risks of a procedure be carefully balanced against its benefits to the patient. While physicians always try to use low dose rates during fluoroscopic procedures, the length of a typical procedure often results in a relatively high absorbed dose to the patient. Recent advances, including the digitization of the captured images and flat-panel detector systems, allow further reduction of the radiation dose to the patient.
The beginning of fluoroscopy can be traced back to 8 November 1895, when Wilhelm Röntgen (in English script, Roentgen) noticed a barium platinocyanide screen fluorescing as a result of being exposed to what he would later call x-rays (the algebraic variable x signifying "unknown"). Within months of this discovery, the first crude fluoroscopes were created. These experimental fluoroscopes were simply cardboard funnels, open at the narrow end for the eyes of the observer, while the wide end was closed with a thin piece of cardboard coated on the inside with a layer of fluorescent metal salt. The fluoroscopic image obtained in this way was quite faint. Even when improved and commercially introduced for diagnostic imaging, the limited light produced by the fluorescent screens of the earliest commercial scopes required a radiologist to sit beforehand in the darkened room where the procedure was to be performed, in order to accustom his eyes and increase their sensitivity to the dim image. The placement of the radiologist behind the screen also resulted in significant radiation doses to the radiologist.
Red adaptation goggles were developed by Wilhelm Trendelenburg in 1916 to address the problem of dark adaptation of the eyes, previously studied by Antoine Beclere. The red light transmitted by the goggles' filters sensitized the physician's eyes prior to the procedure, while still allowing him to receive enough light to function normally. The development of the X-ray image intensifier by Westinghouse in the late 1940s,[1] in combination with the closed-circuit TV cameras of the 1950s, revolutionized fluoroscopy. The red adaptation goggles became obsolete as image intensifiers allowed the light produced by the fluorescent screen to be amplified and made visible in a lighted room. The addition of the camera enabled viewing of the image on a monitor, allowing a radiologist to view the images in a separate room, away from the risk of radiation exposure. Further improvements in screen phosphors, image intensifiers and, later, flat panel detectors have increased image quality while minimizing the radiation dose to the patient. Modern fluoroscopes use CsI screens and produce noise-limited images, ensuring that a minimal radiation dose still yields images of acceptable quality.
Thomas Edison began investigating materials for their ability to fluoresce when x-rayed in the late 1890s, and by the turn of the century had invented a fluoroscope with sufficient image intensity to be commercialized. Edison quickly discovered that calcium tungstate screens produced brighter images. He abandoned his research in 1903, however, because of the health hazards that accompanied use of these early devices. A glassblower who made lab equipment and tubes at Edison's laboratory was repeatedly exposed, suffered radiation poisoning and later succumbed to an aggressive cancer. Edison himself damaged an eye while testing these early fluoroscopes.[2]
During this early commercial development, many incorrectly predicted that the moving images of fluoroscopy would completely replace roentgenographs (still-image diagnostic radiographs), but the then-superior diagnostic quality of the roentgenograph, together with the safety advantage of its shorter radiation exposure noted above, prevented this from occurring. More trivial uses of the technology also appeared from the 1930s to the 1950s, including the shoe-fitting fluoroscope used in shoe stores.[3]
Later, in the early 1960s, Frederick G. Weighart[4] and James F. McNulty[5] at Automation Industries, Inc., then in El Segundo, California, produced the world's first image to be digitally generated in real time on a fluoroscope, while developing a portable apparatus, later commercialized, for nondestructive testing of naval aircraft. Square-wave signals were detected by the pixels of a cathode-ray tube to create the image. Digital imaging technology was reintroduced to fluoroscopy after the development of improved detector systems from the late 1980s.
Because fluoroscopy involves the use of x-rays, a form of ionizing radiation, all fluoroscopic procedures pose a potential risk of radiation-induced cancer to the patient. Radiation doses to the patient depend greatly on the size of the patient as well as the length of the procedure, with typical skin dose rates quoted at 20–50 mGy/min. Exposure times vary depending on the procedure being performed, but procedure times of up to 75 minutes have been documented. Because of these long procedure times, in addition to the cancer risk and other stochastic radiation effects, deterministic radiation effects have also been observed, ranging from mild erythema, equivalent to a sunburn, to more serious burns.
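As a rough illustration of the magnitudes quoted above, the cumulative skin dose implied by these figures can be estimated with simple arithmetic. This is a sketch only: it assumes continuous exposure at a constant dose rate, whereas real procedures use intermittent and often pulsed exposure.

```python
def cumulative_skin_dose_mgy(dose_rate_mgy_per_min, minutes):
    """Cumulative skin dose in mGy, assuming a constant dose rate."""
    return dose_rate_mgy_per_min * minutes

# Typical quoted skin dose rates are 20-50 mGy/min; procedure times of
# up to 75 minutes have been documented.
low = cumulative_skin_dose_mgy(20, 75)   # 1500 mGy = 1.5 Gy
high = cumulative_skin_dose_mgy(50, 75)  # 3750 mGy = 3.75 Gy
print(low, high)
```

Doses in the gray range at the skin are the regime in which the deterministic effects mentioned above (erythema and burns) become possible, which is why procedure length matters so much.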
A study of radiation induced skin injuries was performed in 1994 by the Food and Drug Administration (FDA)[6][7] followed by an advisory to minimize further fluoroscopy-induced injuries.[8] The problem of radiation injuries due to fluoroscopy has been further addressed in review articles in 2000[9] and 2010.[10]
While deterministic radiation effects are a possibility, radiation burns are not typical of standard fluoroscopic procedures. Most procedures sufficiently long in duration to produce radiation burns are part of necessary life-saving operations.
X-ray image intensifiers generally have radiation-reducing systems such as pulsed rather than constant radiation, and last image hold, which "freezes" the screen and makes it available for examination without exposing the patient to unnecessary radiation.[11]
The first fluoroscopes consisted of an x-ray source and a fluorescent screen, between which the patient would be placed. As the x-rays pass through the patient, they are attenuated by varying amounts as they interact with the different internal structures of the body, casting a shadow of those structures on the fluorescent screen. Images on the screen are produced as the unattenuated x-rays interact with atoms in the screen through the photoelectric effect, giving their energy to the electrons. While much of the energy given to the electrons is dissipated as heat, a fraction of it is given off as visible light, producing the image. Early radiologists would adapt their eyes to view the dim fluoroscopic images by sitting in darkened rooms, or by wearing red adaptation goggles.
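The differential attenuation described above follows the Beer–Lambert law, I = I₀·exp(−μx), where μ is the linear attenuation coefficient of the material and x its thickness. A minimal sketch, using illustrative (not measured) coefficients to show why bone casts a darker shadow than soft tissue:

```python
import math

def transmitted_fraction(mu_per_cm, thickness_cm):
    """Beer-Lambert law: fraction of x-ray intensity transmitted
    through a uniform slab of material."""
    return math.exp(-mu_per_cm * thickness_cm)

# Illustrative coefficients only (order of magnitude, not real data):
# bone attenuates far more strongly than soft tissue at diagnostic
# energies, which is what produces the shadow image on the screen.
through_tissue = transmitted_fraction(0.2, 10)               # 10 cm of soft tissue
through_bone = transmitted_fraction(0.5, 2) * \
               transmitted_fraction(0.2, 8)                  # 2 cm bone + 8 cm tissue
print(through_tissue, through_bone)
```

Attenuation multiplies along the beam path, so a composite path is the product of the per-slab fractions, as in the second line.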
The invention of X-ray image intensifiers in the 1950s allowed the image on the screen to be visible under normal lighting conditions, as well as providing the option of recording the images with a conventional camera. Subsequent improvements included the coupling of, at first, video cameras and, later, CCD cameras to permit recording of moving images and electronic storage of still images.
Modern image intensifiers no longer use a separate fluorescent screen. Instead, a caesium iodide phosphor is deposited directly on the photocathode of the intensifier tube. On a typical general-purpose system, the output image is approximately 10⁵ times brighter than the input image. This brightness gain comprises a flux gain (amplification of photon number) and a minification gain (concentration of photons from a large input screen onto a small output screen), each of approximately 100. This level of gain is sufficient that quantum noise, due to the limited number of x-ray photons, is a significant factor limiting image quality.
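The minification term quoted above follows directly from the ratio of screen areas: the same photons are concentrated from the large input screen onto the small output screen. A sketch with illustrative dimensions (not the specification of any particular tube):

```python
def minification_gain(input_diameter_mm, output_diameter_mm):
    """Minification gain of an image intensifier: the ratio of input
    to output screen areas, i.e. the squared ratio of diameters."""
    return (input_diameter_mm / output_diameter_mm) ** 2

# e.g. a 250 mm input screen concentrated onto a 25 mm output screen
# (illustrative figures) gives a minification gain of about 100,
# matching the "approximately 100" quoted in the text.
print(minification_gain(250, 25))  # 100.0
```

The flux gain is a property of the photocathode and accelerating voltage rather than of geometry, so it has no comparably simple formula.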
Image intensifiers are available with input diameters of up to 45 cm, and a resolution of approximately 2–3 line pairs mm⁻¹.
The introduction of flat-panel detectors allows for the replacement of the image intensifier in fluoroscope design. Flat panel detectors offer increased sensitivity to X-rays, and therefore have the potential to reduce patient radiation dose. Temporal resolution is also improved over image intensifiers, reducing motion blurring. Contrast ratio is also improved over image intensifiers: flat-panel detectors are linear over a very wide latitude, whereas image intensifiers have a maximum contrast ratio of about 35:1. Spatial resolution is approximately equal, although an image intensifier operating in 'magnification' mode may be slightly better than a flat panel.
Flat panel detectors are considerably more expensive to purchase and repair than image intensifiers, so their uptake is primarily in specialties that require high-speed imaging, e.g., vascular imaging and cardiac catheterization.
A number of substances have been used as positive contrast agents, including silver, bismuth, caesium, thorium, tin, zirconium, tantalum, tungsten and lanthanide compounds. The use of thoria (thorium dioxide) was rapidly abandoned because thorium causes liver cancer. Most modern injected radiographic positive contrast media are iodine-based. Iodinated contrast comes in two forms: ionic and non-ionic compounds. Non-ionic contrast is significantly more expensive than ionic (approximately three to five times the cost); however, non-ionic contrast tends to be safer for the patient, causing fewer allergic reactions and fewer uncomfortable side effects such as hot sensations or flushing. Most imaging centers now use non-ionic contrast exclusively, finding that the benefits to patients outweigh the expense.
Negative radiographic contrast agents are air and carbon dioxide (CO2). The latter is easily absorbed by the body and causes less spasm. It can also be injected into the blood, whereas air cannot.
In addition to spatial blurring factors that plague all x-ray imaging devices, caused by such things as Lubberts effect, K-fluorescence reabsorption and electron range, fluoroscopic systems also experience temporal blurring due to system lag. This temporal blurring has the effect of averaging frames together. While this helps reduce noise in images with stationary objects, it creates motion blurring for moving objects. Temporal blurring also complicates measurements of system performance for fluoroscopic systems.
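System lag behaves like a recursive frame average: each displayed frame is a weighted mix of the current detector frame and the previous displayed frame. A minimal sketch of this effect, assuming a simple first-order lag model (the coefficient is illustrative, not a property of any real system):

```python
def lagged_frames(frames, lag=0.3):
    """First-order recursive filter modelling fluoroscopic system lag:
    out[n] = (1 - lag) * frames[n] + lag * out[n-1].
    Averaging frames reduces noise on stationary content but smears
    moving content across several frames (motion blur)."""
    out = []
    prev = frames[0]
    for f in frames:
        prev = (1 - lag) * f + lag * prev
        out.append(prev)
    return out

# A sudden step change in pixel value (an object moving into view)
# is smeared over several output frames instead of appearing at once:
print(lagged_frames([0, 0, 1, 1, 1], lag=0.5))
# -> [0.0, 0.0, 0.5, 0.75, 0.875]
```

This is also why lag complicates performance measurement: the effective noise reduction depends on how many past frames contribute, which standard single-frame metrics do not capture.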
Fluoroscopy can be used to examine the digestive system using a substance that is opaque to X-rays (usually barium sulfate or Gastrografin), which is introduced into the digestive tract either by swallowing or as an enema. This is normally done as part of a double-contrast technique, using both positive and negative contrast. Barium sulfate coats the walls of the digestive tract (positive contrast), which allows the shape of the tract to be outlined as white or clear on an X-ray. Air may then be introduced (negative contrast), which appears black on the film. The barium meal is an example of a contrast agent swallowed to examine the upper digestive tract. Note that while soluble barium compounds are very toxic, the insoluble barium sulfate is non-toxic because its low solubility prevents the body from absorbing it.
Another common procedure is the modified barium swallow study, during which the patient ingests barium-impregnated liquids and solids. A radiologist records and, with a speech pathologist, interprets the resulting images to diagnose oral and pharyngeal swallowing dysfunction. Modified barium swallow studies are also used to study normal swallow function.
In the medical information vernacular, "cine" refers to cineradiography, which records fluoroscopic images at 30 frames per second of internal organs such as the heart, taken during injection of contrast dye to better visualize regions of stenosis, or to record motility in the gastrointestinal tract. The technology is being replaced by digital imaging systems that decrease the frame rate, but also decrease the absorbed radiation dose to the patient.[citation needed]