“Helping educational programs make digital radiography easy to understand.”

Digital in the Curriculum: How Much, How Deep?

Quinn B. Carroll, MEd, RT

2019 West Coast Educators Conference, Orlando, FL

 

 

We are adjusting to a new PARADIGM:

 

[Paradigm: An assumption or pattern so taken for granted that it forms a framework for our thinking]

 

First, digital is not a footnote to our curriculum that can be addressed by adding units to an existing course; it is a whole new course

-How many credits?  I recommend at least 2.

-Second, it impacts every other course in the curriculum including introduction, math review, etc.

                -We should reexamine every course, every chapter

It’s not unmanageable, it’s just new!

 

  1. REVAMPING THE “HIERARCHY” OF IMAGE QUALITIES

 

-D. Donahue 1970s: 2 X 4 categories

-G. Cummings 1990s: 2 X 4 categories

-Per Standard Definitions 2017:  4 categories:  DISTORTION = Shape & Size Distortion

-The KID Test: Magnification    -Ask the kids:

6-year-old: “What is a magnifying glass?” Vs. “What is a size distortion glass?”

 

-Based on Std Def’s:

-In this scheme, where do we place quantum mottle, fog, grid lines, artifacts or electronic noise?

WHY WE NEED NOISE: – A case study of why the conventional Standard Definitions is many years overdue for a complete make-over

-Physicists have emphasized SNR for years

-Neither term, noise nor SNR, is found in the Standard Definitions

They are in the ASRT Curriculum Guide

More important now, with digital imaging (e.g., prevalence of mottle) BUT…

 

-Noise is more than just Mottle:

-If one had to choose, mottle and fog both fit best under “contrast” as demonstrated

-Physicists use low-contrast dots to measure the level of mottle present

-However, this still doesn’t account for grid lines and many other artifacts that also constitute forms of noise

 

How should we define noise?  c/o Dr. Anthony Wolbarst “The Physics of Radiology”:

“Anything which obstructs visibility of image details”

 

-20 Types!  In 8 categories:

Quotes from Dr. Anthony Wolbarst’s “The Physics of Radiology”:

 

– “From a broader perspective, noise is anything in an image that detracts from its clinical usefulness”

 

– “Imaging detector devices generate weak electric signals … and the number of electrons actually involved at any instant will vary about the average according to Poisson statistics. Sometimes more noticeable is the noise injected into the sensitive detector electronics by external sources, such as lightning storms, sparking of machinery, and some fluorescent lamps”
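-A quick way to let students see the Poisson statistics Wolbarst describes is a small simulation. The sketch below is my own illustration in Python/NumPy (not from Wolbarst); it generates a uniform exposure at three hypothetical detector-entrance levels and shows that the signal-to-noise ratio grows roughly as the square root of the quanta detected, which is exactly why underexposed digital images look mottled:

    import numpy as np

    rng = np.random.default_rng(seed=1)

    # A uniform exposure detected as Poisson-distributed quanta per del (detector element):
    for mean_quanta in (100, 1_000, 10_000):      # hypothetical low, medium, high receptor exposures
        counts = rng.poisson(lam=mean_quanta, size=(256, 256))
        noise = counts.std()                      # Poisson standard deviation is about sqrt(mean)
        snr = counts.mean() / noise               # so SNR grows as the square root of detected quanta
        print(f"mean quanta {mean_quanta:>6}: mottle (std) = {noise:6.1f}, SNR = {snr:5.1f}")

Quadrupling the exposure roughly doubles the SNR; rescaling can restore brightness after underexposure, but it cannot put the missing quanta back.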

 

– “Imaging devices can contribute to noise in a variety of ways. Imperfections in computer reconstruction algorithms … may lead to abnormalities in images produced”

 

-Proposal for NOISE category, a VISIBILITY FACTOR

*ALL negative visibility factors fall under one column: Noise

 

-Proposed general categories

-My categories

Re: “Spatial Resolution”:

-From 2017 Std Definitions: “The sharpness of the structural edges recorded in the image”

 

-For 3 decades, I taught “sharpness” throughout the program, then, in Registry Review told my graduating students that, by the way, on the exam the ARRT will call it “recorded detail”

 

-Now I recommend teaching “sharpness” throughout the program, then tell your graduating students that, by the way, on the exam the ARRT will call it “spatial resolution”

 

-See How Digital Has Changed Spatial Resolution at ASRT LiveOnline for a full discussion

 

The KID Test: Sharpness -Ask the kids …

10-year-old: “Is this image sharp or blurry?” Vs. “Is this image spatially resolved?”

 

Physicists have no problem using the term “Sharpness”

 

-The categorization problem with using “spatial resolution” for “sharpness” –Spatial Res repeats:

 

-Note that we are adopting the physics term “spatial resolution” while shunning the physics terms “noise” and “SNR”

 

-Example of an attempt to force new wine into old bottles:

Trying to equate digital terms with old film/screen terms: “Dynamic Range, Receptor Contrast”

-If the concept of dynamic range is restricted to the “detector” or image receptor, then how do we teach dynamic range compression or dynamic range control?

-DRC is a computer postprocessing function

-Changing it at the IR would require going back in time
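-To make the point concrete for students, a toy model of DRC can be shown operating purely on stored pixel values, long after the exposure. The sketch below is my own simplification (real DRC algorithms are vendor-specific and typically act only on the low-frequency component of the image); it simply pulls pixel values toward the image mean:

    import numpy as np

    def dynamic_range_compression(image: np.ndarray, strength: float = 0.5) -> np.ndarray:
        # Pull every pixel value toward the mean, narrowing the range that must be displayed.
        mean = image.mean()
        return mean + (1.0 - strength) * (image - mean)

    # Hypothetical wide-latitude pixel values already sitting in computer memory:
    stored = np.array([200.0, 800.0, 2600.0, 3800.0])
    print(dynamic_range_compression(stored, strength=0.5))   # same ordering, half the spread

Nothing about the exposure or the receptor changes; only the stored numbers do, which is the point.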

 

-“Receptor Contrast” = Appears to be an invented term, but worse:

“Receptor Contrast”: “A characteristic of the receptor, linear, impacted by, limited by ..”

                                -NO ACTUAL DEFINITION AT ALL!

 

 

  2. ORGANIZING THE CURRICULUM BY VARIABLES INSTEAD OF IMAGE QUALITIES

 

-Ockham’s Razor: “Entities must not needlessly be multiplied”  -William of Ockham

Curriculum Organized by Image Qualities:

Density/Brightness: 1. mAs / mA / s  2. kVp  3. Generator  4. Filtration  5. Field Size  6. Patient Status  7. Contrast Agents  8. Grids  9. SID  10. OID  ~ ADD DIGITAL:  11. Rescaling  12. LUTs  13. Windowing

Contrast/Gray Scale: 1. kVp  2. Generator  3. Filtration  4. Field Size  5. Patient Status  6. Contrast Agents  7. Grids  8. OID  ~ ADD DIGITAL:  9. Rescaling  10. LUTs  11. DRC  12. Windowing  13. Detail Processing

Sharpness/Spatial Resolution: 1. Focal Spot  2. SID  3. OID  4. Time/Motion  ~ ADD DIGITAL:  5. Detail Processing

Distortion: 1. Alignment  2. SID  3. OID

Curriculum Organized by Variables: Same 5 or 6 questions posed in every unit

1. mAs:        Brightness/Density, Gray Scale/Contrast, Noise, Sp. Res./Sharpness, Distortion
2. kVp:        Brightness/Density, Gray Scale/Contrast, Noise, Sp. Res./Sharpness, Distortion
3. Field Size: Brightness/Density, Gray Scale/Contrast, Noise, Sp. Res./Sharpness, Distortion
4. Windowing:  Brightness/Density, Gray Scale/Contrast, Noise, Sp. Res./Sharpness, Distortion
5. Rescaling:  Brightness/Density, Gray Scale/Contrast, Noise, Sp. Res./Sharpness, Distortion
6. LUTs:       Brightness/Density, Gray Scale/Contrast, Noise, Sp. Res./Sharpness, Distortion

 

  3. HOW DEEP, AND FITTING IT ALL IN: WHAT SHOULD OUR FOCUS BE?

 

IMPACT OF DIGITAL TECHNOLOGY IS SUBSTANTIAL:

-Implications for radiographic TECHNIQUE

-e.g., new role for kVp

-Implications for saving PATIENT DOSE

-Virtual grid software eliminates grids in many cases

-Increased exposure latitude for high kVp, filters, grids

-The “how” of digital processing directly relates to OPERATOR ADJUSTMENTS to the displayed image

-Many “features” are re-applications of processing functions

-Display monitor characteristics determine DIAGNOSTIC IMAGE QUALITY

-THESE ARE ALL CLINICALLY RELEVANT ISSUES-

 

2017 ASRT CURRICULUM GUIDE for RADIOGRAPHY:

ELECTRONIC IMAGE DISPLAY & VIEWING

DIGITAL IMAGE ACQUISITION AND DISPLAY

A. Monitor = FILM Equivalent
1. Characteristics = Structure, Crystals
2. Care and maintenance = Storage
3. Quality control = Sensitometry
B. Viewing conditions
1. Ambient lighting = Viewbox (Illuminator)
2. Viewing angle = Densitometer
C. Hard copy (i.e., laser film)

-For film, we had to know its basic structure and function, including chemical reduction & oxidation of silver bromide molecules during processing

  • Some considered memorization of chemical names as going too far

 

-Equivalent understanding of the LCD monitor is to know its basic structure and function, including the polarization of light and how electric charge is used to create different levels of brightness

  • Requiring the memorization of the specific chemicals used for the electrical conductors and nematic crystals is going too far

 

 

IMAGE ANALYSIS

III. Image Appearance Characteristics

  1. Brightness
  2. Noise <-
    1. Random (e.g., quantum mottle, scatter)
    2. Periodic (e.g., electronic interference, detector malfunction, software)
  3. Grayscale (contrast)
  4. Signal-to-noise ratio (SNR) <-
  5. Contrast-to-noise ratio (CNR)
  6. Spatial resolution
    1. Motion
    2. Geometric
    3. Receptor and detector
  7. Contrast resolution
  8. Shape distortion
  9. Magnification <- YAY!!
    1. Geometric
    2. Display <-

 

-Where will we fit it all?

  • Have we lost some of our focus?
    • Pathology: Do we really need 48 hours? Is it in our job description? (Vs. MD’s)
      • Our own standards of practice expressly forbid diagnosing images
  • CT: Has its own registry
    • Does it really belong in our “bread-and-butter” radiography curriculum?
    • Why not offer it evenings as elective and/or CE course open to techs?

 

ADDITIONAL STUFF WE CAN GET RID OF

-Consider reducing:

Mechanics (velocity, acceleration, etc.)

Electrical circuit problems: parallel, series, mixed

Details on 3-phase generators and transformers

PACS management

Informatics in-depth

Other ideas?

 

A few things to know about LCD Monitors:

Based on Polarization of Light

Nematic Liquid Crystals

Electric charge controls amount of light escaping

 

CARE:

Pixels are built right into the screen

Poking or applying pressure can damage or destroy pixels

Most “dead” pixels occur from this type of abuse, so manufacturers do not warranty against them

 

IMPORTANT DISADVANTAGES:

  1. Limited viewing angle
  2. Inability to transmit true black → ↓ apparent contrast

Ambient lighting = crucial

  3. Brightness is time-dependent

 

What really is a “pixel”?

To a computer expert, a pixel has no particular shape or dimensions, but is a point location or address which has been assigned a numerical value

For displayed medical images, however, we define a pixel as the smallest screen element which can represent all gray levels within the dynamic range of the imaging system

These elements do have both a shape and an area size

 

The Weakest Link:

The display monitor is typically much more limited in both resolution and dynamic range (bit depth) than the digital image processing system of the computer

This is why class 1 monitors for the radiologist’s diagnostic workstation are so expensive

It is also why radiographers must be careful about judging images displayed on a class 2 workstation monitor

A poor quality monitor can effectively destroy the sharpness already achieved during image acquisition and processing

 

  4. LEADING OUT: TEACHING TO THE TEST, TEACHING TO CLINICAL APPLICATION

 

Terminology:

WE, the educators, must take the lead in this process, rather than leaving it to manufacturers, our favorite physicist, or to the registry

 

-At present, discrepancies in terminology used by different manufacturers, and sometimes by physicists, are overwhelming us –

  • Our job is to select good generic terms, look at what different authors are doing with these terms, make a thorough survey in each case, then decide which one is:
    • most universally applicable, and
    • has the greatest clarity in use

-Educators CANNOT follow the way clinical radiographers use the terms, OR the way manufacturers do

  • In each case, we must choose the term that offers the most clarity for our students when they hear it used for the first time

-I believe: “If we teach our students to understand the concepts, they will not miss a registry question because a different term is used.”

  • Example: I always called magnification “magnification,” not “size distortion”

 

Consistent Use of Terms: Brightness and Contrast Controls Vs. Window Level and Width

  • Strictly speaking,
    • High Window Level means a darker image
    • Window level = opposite of brightness
    • High Window Width means more gray scale
    • Window width = opposite of contrast

(Demo: CT)
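-For instructors who want to show the arithmetic behind the demo, here is a minimal sketch of the standard level/width mapping (my own illustration; the stored values and window settings are hypothetical). Raising the level pushes every tissue value toward black, and widening the window shrinks the brightness differences between tissues, i.e., longer gray scale:

    import numpy as np

    def apply_window(values: np.ndarray, level: float, width: float) -> np.ndarray:
        # Map stored pixel values to display brightness (0 = black, 255 = white).
        low, high = level - width / 2, level + width / 2
        return np.clip((values - low) / (high - low) * 255.0, 0, 255)

    tissues = np.array([-50.0, 0.0, 50.0, 100.0])       # hypothetical stored values (e.g., CT numbers)

    print(apply_window(tissues, level=0,  width=200))   # baseline display
    print(apply_window(tissues, level=50, width=200))   # higher level  -> every tissue displays darker
    print(apply_window(tissues, level=0,  width=400))   # wider window  -> smaller brightness differences
                                                        #                  (longer gray scale, lower contrast)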

 

– On Teaching to the Test:

Major considerations for ARRT in determining content:

Identified on the Task Analysis?

Are most programs teaching it?

Exam structured to measure entry-level skills:

-Does this mean we should be also?

 

Our major consideration: Is it clinically relevant?

If we focus exclusively on what is “on the registry,” the whole educational process can get locked in a circular pattern that never quite reaches a state-of-the-art curriculum

 

WE, the educators, should be the ones leading out by keeping our curriculum up to date

 

It’s not ARRT’s job to jostle us into the future by putting the items they think we should be teaching on the test before we’ve gotten into a regular pattern of teaching them, rather…

 

It’s our job to jostle the ARRT into adding appropriate items to the test because most of us are now teaching them

 

-We need more confidence …

– In ourselves as educators:

  • If we are making a reasonable effort to keep our curriculum up to date and relevant to the clinical setting, I don’t believe we need obsess over what is or is not “on the registry”

 

-In the registry:

  • We should recognize the professional care with which questions are vetted for the exam, the objective analysis that every question goes through, and the fact that questions can be worded in such a way that discrepancies between different authors are taken into account

 

  5. GIVING UP ON “CONTROLLING” FACTORS

 

Changing Roles for kVp and Technique

“Unlike screen-film imaging, image display in digital radiography is independent of (decoupled from) image acquisition” -AAPM Task Group 116, 2009 Report

Effects on the Latent Image Vs. the Displayed Image:

Everything we used to say about kVp and image contrast on a film can still be said about the subject contrast present in the remnant beam reaching the detector

In the digital age, we might consider this the latent image

By the time this information is converted into numerical data, digitized, equalized, rescaled, gradation processed, detail enhanced, noise-corrected, and formatted for display, the only image quality that has not been tampered with by the computer is distortion!

Factors Affecting Displayed Image Contrast: 1. kVp, 2. Rescaling, 3. LUT applied for gradation processing, 4. Reduction of fog patterns by frequency processing, 5. Windowing

Greatest impact?

-If kVp still controls contrast, then what do LUTs do?
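-One way to answer that question in class is to show that a LUT is literally a table: each incoming pixel value is replaced by whatever the table says, so the curve built into the table sets the displayed contrast regardless of the technique used. The sketch below is a generic illustration (the S-shaped curve is hypothetical, not any vendor’s):

    import numpy as np

    # A hypothetical S-shaped gradation curve stored as a lookup table for 8-bit values:
    x = np.arange(256)
    s_curve_lut = (255 / (1 + np.exp(-(x - 128) / 25))).astype(np.uint8)

    def apply_lut(image: np.ndarray, lut: np.ndarray) -> np.ndarray:
        # Gradation processing: every pixel value is simply replaced by its table entry.
        return lut[image]

    flat_image = np.array([[90, 110, 130, 150]], dtype=np.uint8)   # low-contrast values out of rescaling
    print(apply_lut(flat_image, s_curve_lut))                      # mid-gray steps are pushed farther apart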

-Contrast Comparisons

-The radiologist typically begins to window the image immediately after bringing it up, according to the anatomy and the pathology to be ruled out – This renders the question of kVp and initial displayed contrast not only minor, but practically irrelevant

We must finally give up on “controlling factors”

When it comes to the qualities of the displayed digital image, the role of the specific mAs/kVp technique used is reduced to one “contributing factor” among several

Even though they may have been critical to making the digital image possible by ensuring adequate signal-to-noise ratio at the detector, stating that initial technique factors “control” any displayed image quality adds more confusion than it alleviates

 

Digital Processing FEATURES to cover

 

Windowing

-Brightness

-Gray Scale

Detail Processing

-Edge Enhancement

-Smoothing

Local Area Rescaling:

Ex: “Underpenetrated” [Correction for underexposure]

 

-Each can be over-applied or misapplied

 

Special Features:

Subtraction

Tomographic Artifact Suppression

Image Stitching

 

Over-use of Smoothing or Edge Enhancement:

  • Over-application of either can lead to loss of detail
  • If an image already has long gray scale or low contrast, applying smoothing in any degree can lead to loss of visibility for fine details (from low local contrast)
  • If an image already possesses high contrast, applying edge enhancement can cause excessive noise
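-For the “how” behind these warnings, a small convolution demo works well. The sketch below is my own illustration with made-up numbers, using an unsharp-mask style boost (vendor detail processing is more elaborate); smoothing averages away both mottle and fine detail, and an over-applied edge boost amplifies the very mottle it feeds on:

    import numpy as np
    from scipy.ndimage import convolve

    rng = np.random.default_rng(0)
    image = rng.normal(loc=100.0, scale=5.0, size=(64, 64))   # flat "anatomy" with a little mottle
    image[:, 32:] += 10.0                                     # one subtle edge

    smoothed = convolve(image, np.full((3, 3), 1 / 9))        # smoothing: local 3x3 averaging

    boost = 2.0                                               # deliberately over-applied
    edge_enhanced = image + boost * (image - smoothed)        # unsharp-mask style edge enhancement

    for name, img in (("original", image), ("smoothed", smoothed), ("edge-enhanced", edge_enhanced)):
        print(f"{name:>13}: mottle (std in flat region) = {img[:, :30].std():4.1f}")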

 

High kVp and Patient Dose

Starting out with a high kVp approach:

Ensures adequate signal penetrating through to the detector, which is critical, and

Always allows some decrease in mAs that results in a net savings in patient dose

-To quote Dr. Hughes, “Even a 10% reduction in patient dose is worth pursuing”
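-A one-line worked example makes the trade concrete for students (the numbers are hypothetical, not from the talk): the classic 15% rule says that raising kVp by about 15% roughly doubles exposure at the detector, so the mAs can be cut in half for a comparable receptor exposure, and it is the mAs cut that buys the dose savings:

    old_kvp, old_mas = 70, 20                 # hypothetical starting technique
    new_kvp = round(old_kvp * 1.15)           # +15% kVp roughly doubles exposure at the detector...
    new_mas = old_mas / 2                     # ...so mAs can be halved for a comparable receptor exposure
    print(f"{old_kvp} kVp @ {old_mas} mAs  ->  {new_kvp} kVp @ {new_mas} mAs")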

 

Original Data Set: Better too long than too short:

Starting with long gray scale = more shades = more information:

To increase contrast, computer simply rounds data, e.g. take every other step

Data set is all true information

 

Starting with short gray scale = fewer shades = less information

To lengthen gray scale, computer has to interpolate  (fill in the blanks)

Part of data set is fabricated information

∴ High kVp
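-The same idea in miniature (my own illustration, with an arbitrary 256-shade scale): shortening a long gray scale just discards real measured steps, while stretching a short one has to invent the in-between shades:

    import numpy as np

    long_scale = np.arange(256)                    # long gray scale: 256 real, measured shades
    short_scale = np.arange(0, 256, 8)             # short gray scale: only 32 real shades

    shortened = long_scale[::2]                    # more contrast from a long scale: keep every other
                                                   # step -- everything left is true information
    stretched = np.interp(np.linspace(0, 31, 256), # longer gray scale from a short one: interpolate
                          np.arange(32),           # the missing shades -- the in-between values are
                          short_scale)             # fabricated, not measured

    print(f"shortened: {shortened.size} shades, all measured")
    print(f"stretched: {stretched.size} shades, only {short_scale.size} actually measured")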

 

– Nothing is more satisfying to me than watching the light come on in someone’s eyes when they suddenly grasp a concept – and they are always grateful. Radiography is so fun to learn about. When we focus on memorizing facts rather than understanding concepts, our students are compelled to do the same, and all the fun is sucked right out of learning.

 

 

 

  6. On making it PRACTICAL:

-Demonstrate APPLICATION as close to classroom instruction as possible:

-Either integrate clinical education with didactic instruction OR

-Use labs copiously

(Story: Cheyenne Lab Machine)

Labs can be fraught with pitfalls: Used (Donovan’s) labs, then made my own

Digital labs add more esoteric causes and unpredictable differences between manufacturers

 

  7. Lab Pitfalls:

-from personal experience:

-With digital equipment, you can expect that for almost any question, the results for some manufacturers will deviate from the general rule

-Is your particular lab equipment the exception?

-Always test your lab before class on the day it will be performed by your students

-For mottle comparisons, brightness and contrast must first be equalized, and magnification standardized (200%)

Lab Pitfalls in Demonstrating Contrast Δ:

NEVER use background density

Use two relatively homogeneous  density areas within the anatomy

Re-calculate 15% for every step change in kVp
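-A small worksheet-style sketch (my own, with hypothetical readings) can keep students honest on both points: the contrast measure comes from two ROI means inside the anatomy, and the 15% change is recomputed from the current kVp at every step rather than reusing the first step size:

    def kvp_series(start_kvp: float, steps: int) -> list:
        # Recompute the 15% change from the CURRENT kVp at every step.
        series = [start_kvp]
        for _ in range(steps):
            series.append(round(series[-1] * 1.15, 1))
        return series

    def roi_contrast(mean_a: float, mean_b: float) -> float:
        # One simple contrast measure: ratio of two mean pixel values from areas WITHIN the anatomy.
        return max(mean_a, mean_b) / min(mean_a, mean_b)

    print(kvp_series(70, steps=3))                       # 70 -> 80.5 -> 92.6 -> 106.5 (not 80.5, 91, 101.5)
    print(round(roi_contrast(mean_a=140, mean_b=95), 2)) # hypothetical ROI readings from the lab image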

 

  8. DIGITAL PROCESSING CONCEPTS – Some of my favorite teaching models

Three Domains for Digital Processing:

  1. In the spatial domain, pixels are “sorted” by their location
  2. In the intensity domain, pixels are “sorted” by their pixel value (brightness, density)
  3. In the frequency domain, objects are “sorted” by their size

-Manufacturer chooses by comparing results for a particular objective
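-A compact way to demonstrate all three “sortings” on the same array (my own sketch; the image here is just random numbers standing in for a radiograph):

    import numpy as np
    from scipy.ndimage import uniform_filter

    rng = np.random.default_rng(2)
    image = rng.integers(0, 256, size=(128, 128)).astype(float)   # stand-in for a digital radiograph

    # Spatial domain: pixels addressed by location (row, column).
    roi = image[10:20, 40:50]

    # Intensity domain: pixels sorted by value -- the histogram used for rescaling and LUT selection.
    histogram, _ = np.histogram(image, bins=32, range=(0, 256))

    # Frequency domain: structures sorted by size -- a blur keeps the large (low-frequency) structures,
    # and whatever the blur removes is the fine (high-frequency) detail.
    low_freq = uniform_filter(image, size=9)
    high_freq = image - low_freq

    print(roi.shape, int(histogram.sum()), round(float(high_freq.std()), 1))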

 

Sidebar: Three general approaches to sorting any digital image:

Preprocessing (Acquisition Processing): All corrections made to “raw” digital image data to compensate for physical flaws in image acquisition inherent in: 1) Elements and circuitry of image receptor, 2) Physical elements and circuitry of processor

Postprocessing: Refinements per anatomical procedure

 

Preprocessing:

  1. Field Uniformity Corrections
  2. Noise and Del Drop-out Corrections
  3. Image and Histogram Analysis
  4. Rescaling (Normalization)

Postprocessing:

  1. Gradation Processing (LUTs)
  2. Detail Processing
  3. Preparation for Display

IMAGE DISPLAYED

  1. Operator Adjustments
  2. Application of Special Features
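-To close the loop for students, the whole chain can be sketched as a sequence of functions, each handing its result to the next. The stage names below follow the outline above, but every implementation here is a deliberately crude stand-in of my own, not a vendor algorithm:

    import numpy as np

    # Toy stand-ins for each stage (purely illustrative):
    def field_uniformity_correction(raw): return raw - np.nanmean(raw, axis=0)  # remove a column gain pattern
    def del_dropout_correction(img):      return np.where(np.isnan(img), np.nanmedian(img), img)
    def rescale(img):                     return (img - img.min()) / (np.ptp(img) + 1e-9) * 4095
    def apply_gradation_lut(img):         return 4095 * (img / 4095) ** 0.7     # simple gamma-style curve
    def detail_processing(img):           return img                            # placeholder
    def format_for_display(img):          return np.clip(img / 16, 0, 255).astype(np.uint8)

    raw = np.random.default_rng(3).normal(2000, 300, size=(64, 64))
    raw[5, 5] = np.nan                                                           # one "dead" del

    displayed = format_for_display(detail_processing(apply_gradation_lut(
        rescale(del_dropout_correction(field_uniformity_correction(raw))))))
    print(displayed.dtype, int(displayed.min()), int(displayed.max()))

Only the final, displayed array reflects every stage, which is why the displayed image qualities cannot be credited to any one “controlling” factor upstream.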