Introduction
It is presumptuous to think that the fields of qEEG and neurofeedback (NF) are
advanced far enough to have a scientifically based qEEG protocol that is a hard and fast
rule. The field is scientific, but using a qEEG to design an intervention is, at this
time, a scientific art. It is entirely foolhardy to make rules for this artistic task,
which is undoubtedly why I was approached for it.
How does NF work?
An effective intervention into any system is to introduce feedback of the signal to be
changed into the system. This allows the system to self-regulate, like the heating or
cooling system in a house, to use a simple analogy. The models of how this works vary from
systems theory, to anatomical/structural models, to learning theory, and even non-linear
dynamics or "chaos theory".
The organic models have some measurable validity, with the observed expansion of
cortical areas dedicated to the structures utilized in tasks. Another observation
supporting this model is the increase in dendritic density in the cortex utilized in
learned tasks. There are even recent reports of memory, or 'long term potentiation', being
predicted by the electrophysiologic brain state measured at the time of the perception to
be recalled (Wagner et al., Science, August 1998).
The learning theory models have learning curve data showing the stages of acquisition
of the skill of volitional control over autonomic activity with NT. They also predict the
effect of session scheduling on efficacy, shortening total treatment times by massing the
initial sessions and stretching out the intervals between later sessions.
The systems theorists suggest the mere introduction of feedback may initiate self
regulation. This is seen when an audible heartbeat normalizes the inter-beat interval,
without any instructions to the subject.
The most controversial systems theorists are the "chaos" theoreticians, who
vary in opinion. Some chaos theoreticians in this field say the anatomically specific
electrode site selection in NF is irrelevant (Brown et al., 1998 SSNR), with others saying
the site selection is critical for optimizing the training in NF (Thatcher, personal
communication 1998). Some theoretical discussions even state that the perturbation of the
system from feedback acts to chaotically restabilize the system, independent of any need
to follow out a learning curve of further treatment: just expose the system to the chaos
of feedback and that is all that is needed (Schore, 1997).
The bottom line is that these are all theories, or perspectives with testable hypotheses
associated with them. Many are not mutually exclusive of the others, so the research
proceeds from various, sometimes multiple perspectives. The complexity and diversity of
the models and opinions attest to the interdisciplinary nature of this field.
The most conservative critics would suggest placebo and experimenter effects as the
sole forces at work in NF. This too is a testable hypothesis, one that has increasing
difficulty explaining the observed results of the research found in peer reviewed
publications.
Why do a qEEG for NT?
There are many in the field of NT who do not perform qEEGs prior to designing a
clinical intervention. These people are currently practicing well within the standard of
practice for this rapidly evolving field.
Many within this group have standard protocols which are used on all clients, with
various alterations to respond to the client's reported experiences during the treatment.
I see the field of NF gradually moving more toward the use of qEEG, but it is not
required by any stretch of the imagination, much less a standard of practice. I am
sometimes misquoted as having said it is unethical to do NF without a qEEG. It may be less
than optimal, in my
estimation, but it is certainly not unethical.
The argument has been raised that the qEEG is only a way to bill the client additional
charges, draining the vital cash reserves of the clients, with no scientific evidence of a
benefit for the use of the qEEG. I agree there is an expense for a qEEG. To routinely
perform a qEEG without a demonstrable treatment benefit would be difficult to justify.
There is an increasing body of evidence that there is a positive treatment impact from
the use of a qEEG and the resultant customized NF intervention. The initial information
came from those using the technique, who "felt" they got some clinical utility
from the qEEG data. The more persuasive evidence to date is a retrospective evaluation of
outcomes in a single practice.
The retrospective research compared 3 years of NF data using a commonly used standard
treatment approach to 2 years using the qEEG based customized intervention. A gross
summary of the paper shows a doubling of the conservatively estimated clinical success,
from 30 to 60%. Further, the total treatment benefit (the 'some benefit' and 'full
benefit' groups added together) increased from the previously reported 80% to 90% now
receiving perceived benefit (C. Wright et al., SSNR, Austin 1998).
The cost effectiveness is easily seen if a few sessions would otherwise be spent
"getting it right" using clinical guesses to select sites. It only takes a few
wasted sessions, not to mention possible adverse reactions, to pay for the proper
selection using the qEEG.
I believe the strongest argument for the use of qEEG stems from the reported incidence
of non-convulsive frontal and temporal lobe epilepsy comorbid with diagnosis of ADD/ADHD.
When I saw 10% quoted in the literature, I was shocked and had some doubts about the
reliability of the observation. After nearly 3 years of doing the screening for one
ADD/ADHD practice, I saw a similar percentage of undiagnosed or "occult"
epilepsy. I now have more faith in the figure.
To use a standard ADD/ADHD intervention with an undiagnosed epileptic may be
problematic, and lack of awareness is no excuse (read 'defense') if legalities are
invoked. The qEEG has a clinical EEG read during its evaluation, allowing for the proper
referral or diagnosis of epilepsy (or any other occult condition such as tumor, metabolic
or toxic encephalopathy or early dementia).
How do the maps tell you where to intervene?
I once heard qEEG referred to as "electro-phrenology", a term that conjures
up images of ancient times and archaic beliefs about brain function. I sort of like the
term, as it speaks to the potential to make simplistic assumptions about intervention
based on colored map "hot" spots, the 'bumps' of electro-phrenology.
QEEGers without an appropriately sophisticated model of how the brain works will be
tempted to stick the intervening electrodes on areas that 'light up' with some color in a
map. Such an area may just as easily be an artifact, a normal finding or a normal variant
as the proper area for intervention. It may also be an effect of a distant cause or of a
change in brain regulation.
The time consuming study of brain function, EEG and the quantitative analysis
techniques, including artifacts, is needed to understand the colorful and informative
mappings, tables of values and database comparisons. Careful study of the database
selected is also needed to understand its strengths and weaknesses (Thatcher, 1998).
One of the earliest NF clinicians to use the qEEG to intervene, in the 1970's, was
Pourier, a Canadian clinician/researcher. He used the Fourier analysis derived compressed
spectral array (CSA) to select the 'deficient' bands and those in 'excess', setting his
protocol to act like a bulldozer, chopping off peaks and filling in valleys. His clinical
judgments were based on experience, not database comparisons, but he did report positive
clinical results.
Hopefully qEEG based NF has advanced since these early days of simplistic assumptions
and electro-phrenology.
Later in this chapter some qEEG patterns will be discussed, along with the NF
interventions indicated by these patterns. First, however, an understanding of the digital
manipulations of the data needs to be put in place.
Artifacting
The qEEG represents the statistical manipulation of the raw EEG, so an understanding of
these manipulations should precede any discussion of the qEEG's clinical indications for
protocols. Without such knowledge any given finding may be misinterpreted.
Following the careful recording of the EEG, the quantitative analysis begins with the
selection of the data to be used in the Fourier transform. The Fourier analysis assumes
there are no transients (epileptic discharges, episodic voltage changes, etc.) or state
changes (light sleep, drug effect, mental task, etc.), so these must be avoided when
selecting data for analysis in qEEG for eyes closed resting database comparison. More
recently, some eyes open and task databases have become available (Hudspeth, Sterman,
Duffy, etc.).
A transient is an event with a rapid onset and ending, with an increase in amplitude of
greater than 50% over the ongoing activity. Epileptiform activity is one common example of
this phenomenon. The only time transient discharges may validly be included is when dipole
localization or "mapping" of the sources of this activity is the intent, and in
this circumstance only significant discharges should be sampled, with the ongoing
background treated as the state change and eliminated.
Less rigorous analysis selection standards exist for data not intended for database
comparison, such as reading or other task related data. Usually these task data will only
be used for gross comparison to the more carefully collected steady state data, looking
for gross changes in brain function. When there is an intent to compare to an eyes closed
normative database, transients will increase the variability of the dataset, but will be
averaged out in the mapping unless persistent or very prominent. These transients will
simultaneously alter the dataset's standard deviation from the norm.
State changes typically include sleep stages collected for an eyes closed awake
database comparison. Stage 1 sleep is a subtle drowsy state; people recorded in it will
usually deny being drowsy when alerted. The alpha begins to wax and wane, with subtle
increases in theta, and occasionally slow rolling eye movements and a decreased EMG tonus
appear. This is just prior to stage 2, in which an object being held will slip from grasp,
alerting the client; most people only realize they were drowsing once stage 2 is reached.
Many people doing a repetitive or automatic pilot type task will be in stage 1 without
being aware of time passing. This is a dangerous situation if response to change is
needed, as there are delays in reaction time associated with this state.
The problem with stage 1 (drowsing) being added to a dataset is that it is a mixing of
states, violating the FFT assumptions and making the validity of database comparison more
than merely suspect. The inclusion invalidates the comparison.
The task of artifacting is to sample enough data to provide reliable maps while
maintaining the validity of the sampling, not taking state changes or transients. The
amount of data acquired should provide highly repeatable or reliable mapping. The time
required to achieve these results is different for each frequency.
Beta becomes reliable in the first 30 to 45 seconds, with alpha following at 60-90
seconds. The intermittent nature of theta makes its reliability the least easily
established, requiring 120-180 seconds. Delta is reliable at about 120 seconds. Our lab
tries to get 120 seconds of data when possible.
Reliability may be established another way, with split-half replication. This actually
looks at the lability within the sampled data, looking for the invariant similarity of the
repeatable parts of the two half data sets and viewing the more variable parts with less
confidence.
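The split-half idea can be sketched numerically. The per-epoch power values below are simulated for illustration (a seeded random draw, not real EEG); in practice they would come from the FFT of each artifacted epoch.

```python
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical per-epoch alpha power values from an artifacted recording.
epoch_power = rng.normal(loc=20.0, scale=2.0, size=60)

# Split-half replication: compare odd-numbered and even-numbered epochs.
half_a = epoch_power[0::2]
half_b = epoch_power[1::2]

# Similar means across the two halves suggest a stable, reliable sample;
# a large discrepancy flags labile data to be viewed with less confidence.
reliability_ratio = half_a.mean() / half_b.mean()
print(round(reliability_ratio, 2))
```

A ratio near 1.0 indicates the two halves replicate each other; how far from 1.0 is tolerable is a judgment call for the lab.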
The time span or length of the epochs selected determines the sensitivity to the slower
frequencies. The Fourier transform has to have a completed waveform within the sampled
data epoch for that frequency to be detected and quantified. A 1/2 Hz low frequency
sensitivity is thus achieved only when epoch lengths of 2 seconds or longer are sampled.
There is a downside to too long an epoch: it increases the likelihood of including state
changes and transients. Still, when clean state-stable data exists in long episodes, it
should be sampled.
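The relation between epoch length and low-frequency sensitivity can be checked directly: the FFT bin spacing is the sampling rate divided by the number of samples, which reduces to one over the epoch length in seconds. The 128 samples-per-second rate below is an assumption for illustration.

```python
import numpy as np

fs = 128            # samples per second (assumed for illustration)
epoch_seconds = 2   # epoch length
n = fs * epoch_seconds

# Bin spacing = fs / n = 1 / epoch_seconds: a 2-second epoch resolves
# frequencies in 0.5 Hz steps, so 0.5 Hz is the lowest (non-DC)
# frequency it can detect and quantify.
freqs = np.fft.rfftfreq(n, d=1.0 / fs)
print(freqs[1])  # 0.5
```

A 1-second epoch would give 1 Hz bins, too coarse for the 1/2 Hz sensitivity discussed above.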
Some equipment has preset epoch lengths and interactions between the sampling rates and
the epoch length, which is problematic for sampling data flexibly. The epochs in some
equipment cannot be adjusted in time to 'slide' past artifacts, making it difficult to
sample the clean data in a record with intermittents like eye movement or other
transients. Careful selection of the equipment should precede entry into this field.
The artifacting is mostly concerned with eliminating the more common artifacts of eye
movement and EMG, as well as movement or electrode artifacts. This cleaning of the data is
part of the art of doing good qEEGs, though the science of adequate sampling and the
assumptions of
the Fourier must be kept in mind.
New artifacts introduced by the digital processing
The digital recording and processing of the raw analog waveform of the EEG should be
understood in technical detail to properly interpret the resultant maps and numerical
tables of findings.
The EEG is converted from the analog data by an analog-to-digital or A-D converter, and
a resultant digital dataset is derived. The digitizing sampling rate and the bit length of
the computer data will determine the resolution of the resultant image and tabular
datasets. The faster the sampling rate, the faster the frequency that can be resolved,
with the minimal sampling frequency defined by the Nyquist principle as 2 times the
frequency being resolved.
Proper reproduction of the EEG for visual perspective requires a more conservative
sampling rate than the 2:1 Nyquist ratio. How much greater than 2:1 must be set by the
individual's preferences. Few would choose less than 128 samples per second; most would
prefer 500 to 1000 per second. (Manufacturers with set buffer sizes for the epochs, like
the Lexicor, will however need to consider the impact on lower frequency sensitivity. For
this situation 128 to 256 is the highest reasonable choice.)
The channel sampling should be simultaneous, to avoid remontaging error or slew. If not
simultaneous, a faster sampling rate will reduce this error (as will burst mode sampling),
reducing the phase or time base error to a minimum.
The bit length of the computer "word" processed by the CPU affects the
amplitude resolution in qEEG, irrespective of the sampling rate. The longer the bit
length, the better the resolution (a bit length of 12 is acceptable, but 16 is preferred;
older units will have 8 bit processing and are not fully adequate without further scaling
adjustments).
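The effect of bit length on amplitude resolution is easy to quantify: each added bit halves the smallest voltage step the converter can record. The 1000 microvolt total input range below is assumed purely for illustration.

```python
# Amplitude resolution of an A-D converter for a given bit length,
# assuming (purely for illustration) a total input range of
# 1000 microvolts (i.e. +/-500 uV).
input_range_uv = 1000.0
resolution_uv = {bits: input_range_uv / 2 ** bits for bits in (8, 12, 16)}

for bits, step in resolution_uv.items():
    # Each additional bit doubles the number of steps across the range.
    print(f"{bits}-bit: {step:.4f} uV per step")
```

On this assumed range, 8 bits resolves only ~3.9 uV steps, while 16 bits resolves ~0.015 uV, which illustrates why 8 bit units are not fully adequate without further scaling.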
The epochs selected during artifacting will all have an abrupt start and stop, without
a zero voltage point at each end. This sudden voltage change is seen by the FFT (fast
Fourier transform) of the computer analysis as a square wave at the start and stop of each
epoch. The resultant output of the Fourier is that all frequencies were present at that
point to "reconstruct" the square wave. This is termed "leakage
artifact", or "Gibbs artifact".
The result of the Gibbs artifact is that if a 10 Hz waveform were put into the FFT, a
spectral plot of the output would show a rise of the baseline where all frequencies were
used to reconstruct the abrupt starts and stops of the epochs. There would be a spectral
peak at 10 Hz, with a tapered response and a broadened base to the frequency plot.
To correct for the leakage or Gibbs artifact, a "windowing" filter is used.
The result of this windowing is the return to baseline of the generally elevated spectral
plot mentioned previously. There remains a residual broadening of the idealized spectral
peak at 10 Hz; this residual artifact following the windowing is the "smearing"
artifact.
The windowing used in qEEG is usually a Hanning window. This filter slowly ramps up at
the start and ramps down at the end of each epoch, to avoid the apparent square wave the
FFT sees. Other windowing techniques are triangular, Blackman, Hamming and Meyer's, with
the lack of windowing occasionally referred to as a "rectangular window". A full
discussion of these details, contrasting the different styles of windowing, is outside the
scope of this chapter.
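The leakage-then-windowing sequence described above can be demonstrated in a few lines. A 10.25 Hz sine (deliberately falling between FFT bins, so the epoch starts and stops abruptly) is analyzed with and without a Hanning window; the sampling rate and epoch length are illustrative assumptions.

```python
import numpy as np

fs = 128                               # samples per second (assumed)
t = np.arange(0, 2, 1.0 / fs)          # one 2-second epoch
alpha = np.sin(2 * np.pi * 10.25 * t)  # 10.25 Hz: not an exact FFT bin,
                                       # so the epoch ends mid-cycle

# Rectangular (no) window vs. Hanning window applied before the FFT.
rect = np.abs(np.fft.rfft(alpha))
hann = np.abs(np.fft.rfft(alpha * np.hanning(len(t))))

freqs = np.fft.rfftfreq(len(t), d=1.0 / fs)
far = freqs > 30  # frequencies well away from the 10 Hz region

# Leakage: the unwindowed spectrum has a raised baseline far from the
# peak; the Hanning window returns that baseline toward zero.
print(rect[far].mean() > 10 * hann[far].mean())  # True
```

The windowed spectrum still shows the residual broadening around 10 Hz, i.e. the "smearing" artifact, which no window removes entirely.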
"Aliasing" is an artifact caused by a frequency source near the sampling rate
(above the Nyquist frequency, half the sampling rate) so that a spurious lower "beat"
frequency is created as an alias of the source frequency/sampling rate interaction.
Anti-aliasing filters are used to control for this artifact in all modern devices.
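Aliasing can be shown directly by sampling a source above the Nyquist frequency and inspecting the spectrum; the 128 Hz sampling rate and 120 Hz source are illustrative assumptions.

```python
import numpy as np

fs = 128  # sampling rate in Hz (assumed for illustration)
t = np.arange(0, 2, 1.0 / fs)

# A 120 Hz source lies above the Nyquist frequency (fs / 2 = 64 Hz).
source = np.sin(2 * np.pi * 120 * t)

# The sampled spectrum shows a spurious peak at |fs - 120| = 8 Hz:
# the alias created by the source/sampling-rate interaction.
spectrum = np.abs(np.fft.rfft(source))
freqs = np.fft.rfftfreq(len(t), d=1.0 / fs)
alias_hz = freqs[np.argmax(spectrum)]
print(alias_hz)  # 8.0
```

Without an anti-aliasing filter ahead of the A-D converter, such a source would be indistinguishable from genuine 8 Hz activity in the record.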
Database issues
The datasets derived from the artifacted EEG are the starting point for the comparison
of these data points to the databases used in qEEG analysis.
The databases supply the means and standard deviations used to establish the
significance probabilities for the observed measurements. These will be represented as
Z-scores; roughly, these may be seen as the number of standard deviations the data points
in the dataset lie from the normative database mean. They are calculated as the patient
value minus the database mean, divided by the standard deviation of the database
population.
Z-scores are reported in tabular form and may be mapped. Significance probability
mapping (a term coined by Frank Duffy, M.D. of Harvard's Children's Hospital) allows the
interpreting individual to view the spatial distribution and extent of deviation in an
easily discernible display. A Z-score is itself expressed in standard deviation units: a
deviation of 1.96 corresponds to the two-tailed .05 significance level, and 3.08 to
approximately the one-tailed .001 level.
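The Z-score calculation just described is a one-line formula. The patient value and normative entries below are hypothetical, for illustration only.

```python
# Z-score as used in qEEG database comparison: the patient's value
# minus the database mean, divided by the database standard deviation.
def z_score(patient_value, db_mean, db_sd):
    return (patient_value - db_mean) / db_sd

# Hypothetical example: theta power at one electrode site compared
# against an age-matched normative database entry.
z = z_score(patient_value=34.0, db_mean=25.0, db_sd=4.0)
print(z)  # 2.25 -> beyond the 1.96 two-tailed .05 threshold
```

A significance probability map simply colors each electrode site by such a Z value, letting the spatial distribution of deviation be read at a glance.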
Normative databases are constructed with highly screened normal individuals with an age
range establishing the limits of the database. The database is constructed controlling for
socio-economic and other demographic influences. Importantly, the databases must be
established with different norms for male and female, to account for the significantly
different neurophysiologic structures and rates of development of the male and female
brains.
The database used should be selected to match the end use and population to be seen.
The age range may be critical for those seeing children or the elderly. For others, the
presence of multivariate stepwise discriminants used in determining the likelihood of
membership in one of
two or more clinical groupings will be critical in the selection. For others, the
presence of an eyes open database for use in clinical eyes open work, or the task related
data will be critical.
The database from E. Roy John, Ph.D. of NYU's Brain Research Laboratory contains more
than univariate measures, with multivariate parametric statistical evaluations of normal
and many clinical subgroups, and has stepwise discriminant analysis associated with it.
Discriminants for head trauma are also available from Thatcher. The Duffy database has
been used, though it is not in commercial distribution now that the Nicolet BEAM
instrumentation does not carry it. Sterman has an age limited performance based database
available, with new databases, such as Hudspeth's, coming into availability recently as
well.
In using the discriminants, if available in a database, care should be exercised to
assure the applicability of the discriminant to the client being evaluated. The client
must fit the conditions that were set for the construction of the discriminant.
If a discriminant is set up to decide the likelihood of being a member of group A or
group B, a member of another group, C, will still be classified as A or B rather than
properly identified as type C. This weakness of discriminants must be controlled in the
selection and use of the discriminant, not after it has been performed.
Displays
The displays in the qEEG report will be presented as a progressive analysis of the
data. The displays may be sorted into those that are closer to and farther from the raw
EEG, moving farther from the EEG as more statistical manipulations are performed. The
farther from the raw data one goes, the easier it is to mistakenly interpret artifacts as
real or to misinterpret relational data.
To avoid these easy and eventually certain mistakes, the full visual interpretation of
the EEG must precede any review of the analyzed data. Only then should this be followed by
a review of the raw amplitude mapping, in as detailed a frequency display as possible.
This may be followed by broad band analysis. The amplitude mapping should be followed by
the power and then the relative power analysis. Only after this step-wise evaluation
should absolute or relative power Z-score or other database comparisons be done.
Following the spectral evaluation, the statistically extracted measures of symmetry,
coherence and phase can be evaluated with less likelihood of mistake.
The presence of artifact should be expected in clinical work, where the luxury of
prolonged recordings, or of rejecting a record outright due to artifacts, is not present
as it is in academic settings. The progressive step by step evaluation will control for
these situations as best they may be. To err may be human, but error should be controlled
and accounted for with methodologic routine to the extent humanly possible.
Regulatory and certification issues
Entering the field of qEEG, one should expect a huge continuing education opportunity
and responsibility ahead. I would advise the initial study of EEG to anyone entering this
field, as it is the basis of the technique. Studying EEG provides a solid foundation for
the later study of qEEG methods and clinical applications.
Some certification in the field of EEG, or basic course completion in EEG, should
precede any entry into qEEG. The field of qEEG does not have any legislated requirements
in California, though certification exists for various levels of practice in this area.
Certification or licensure is regulated on a state-by-state basis.
The medical professional should proceed to board certification in EEG, then the qEEG
specialization. This has been available from both the American Board of
Electroencephalography and Neurophysiology (ABEN, AMEEGA's associated testing board) and
American Academy of Neurology (AAN) affiliated testing bodies for EEG, but only from ABEN
for the quantitative EEG specialization.
Psychologists have had the American Psychiatric Electrophysiology Association (APEA)
for education in the field; however, the APEA and AMEEGA merged in 1998 to form the EEG
and Clinical Neurosciences Society (ECNS). This combined body will offer the board
certification previously available to medical doctors through an associated testing board,
ABEN.
I must disclose an association with technologist educational and testing groups, the
AAQEEG and ABCQEEG boards, offering training seminars in EEG and qEEG and certification as
a QEEGT. This certification is not required to practice anywhere, but does show a
demonstrated competence in the technique of qEEG. Testing should not be attempted without
substantial investment in studying the details of the technique, as the low pass rate
would attest.
There is also the issue of regulatory approval for equipment, both in EEG/qEEG and in
NT. The FDA has regulatory authority over NF and EEG devices that are sold or marketed,
including NF devices, qEEG hardware and software... even databases. The FDA does not
approve devices; it registers them, after they have gone through an arduous regulatory
review for the validity of the claims and the safety and efficacy of the device.
Many manufacturers have cleared the FDA registration, though there are some who
continue to try flying beneath the regulatory 'radar'. Equipment purchased by an end user
that is not federally registered may be 'taken' without compensation by the FDA. As you
can see, there is a benefit to looking for the FDA 510(k) number before purchasing any
hardware or software.
Individuals interested in workshops and conventions in the field of NF and qEEG should
look at AAPB and SSNR as well as Rob Kall's conferences and workshops as solid resources
in this area. Regional and local organizations and interest groups are often available as
well.
Clinical applications of qEEG
The qEEG is used by those currently in professional practice for the following clinical
applications: evaluating effects of medications and predicting medication response,
evaluating head traumas, assessing cognitive and psychiatric changes, and guiding NF and
peak performance assessment and training, among others.
The use of qEEG 'dipole localization' in assessing surgical candidates before brain
surgery in epilepsy is one of its most solidly accepted areas, with AAN approval. In other
areas of qEEG application there are various levels of support, or lack thereof, from the
various professional groups.
In the hands of those familiar with the field, qEEG can be used well in all these
areas. In the hands of someone unfamiliar with the details of the technique, it can be a
waste of time, a source of distraction or difficulty or even a liability.
Some patterns seen in the qEEG and their indicated interventions:
Diffuse slowing, with slower alpha
The ascending reticular activating system stimulates the diffuse thalamic projection
system and sets the general arousal level of the brain. With an increase in CNS arousal
level, there is an increase in the mean frequency of alpha and a decrease in slowing. With
decreases in arousal there is a slowing of the alpha, as well as, eventually, an increase
in diffusely distributed slowing (a mixture of diffuse lower voltage delta and theta,
usually with a weak vertex prominence in linked ear montages).
When this diffuse slowing with slower alpha is seen, a vertex or central sensory-motor
strip beta training will slightly speed up the alpha and decrease the slowing seen. A
frontal beta minima seen in the data may respond to a more anterior placement for the beta
training. This increased beta training should correspond to a brain stem shift in RAS
activation, with increased norepinephrine levels' stimulating effect, resulting in
increased vigilance.
When this slowing and alpha pattern is seen, but with alpha intruding frontally
(occasionally with less of the slowing), the protocol should include some parietal
"high alpha" training, defined as 11-16 Hz in the classical EEG literature, but
usually taken in NT as from 10 or 11 to 14 Hz.
This parietal high alpha training shifts the alpha mean frequency higher, decreasing
the diffuse projection system's frontal alpha and increasing the posteriorly distributed
specific projection system alpha (though at a slightly faster frequency distribution).
Often it is the slower alpha frequencies that are intruding frontally, and they are
reduced with this shift.
Focal slowing
Focal slowing that is not an artifact (such as a pulse, electrode, electrodermal, eye
movement or other artifactual source of slowing) should be evaluated by an
electroencephalographer or neurologist. The focal slowing may be from a tumor, ischemia,
stroke, trauma, inflammation or other medical condition. The etiology should be identified
prior to any intervention or consideration of NT.
Once the etiology is known, generally the reduction of the slowing and enhancement of
faster activity improves brain function following NT. Slowing is reported in specific
learning disabilities and sensory processing problems (Chabot et al., SSNR Aspen, 1997).
Faster alpha variants, not low voltage
The alpha frequencies may be faster than usual, sometimes corresponding with
anxiousness or hypervigilance. These situations can present with complaints about
attentional problems, with the hypervigilance acting as a source of increased distraction,
but with the process differing from the more usual ADD/ADHD presentations.
The faster alpha often has increased EMG associated with it. Patients with these
findings seem to respond with paradoxical increased anxiety if EMG relaxation is tried
without first addressing the hypervigilance of the faster alpha. The 'letting down of the
guard' is anxiety producing.
Training the slower end of the normal alpha band parietally seems to have a strong
positive effect on these individuals, with SMR training used by some therapists. If the
EMG remains, the subsequent relaxation therapy seems to work without the increased anxiety
following the NF
intervention.
Frontal lobe disturbances
The frontal lobes inhibit distractions and inappropriate impulsively motivated
behavior, and control affective mood states and attentional states. The frontal lobes also
set the motor strip's tone via inhibitory control loops involving subcortical structures.
The frontal lobe has general regulatory control over the entire rest of the brain.
In attentional and affective disorders, as well as motor dyscontrol such as
hyperkinetic disturbances, the locus of the dysfunction is commonly the frontal lobe.
Work recently done on ADD/ADHD and affective disorders shows a variety of frontal
disturbances seen with the qEEG. These varieties include slowing in theta, alpha intrusion
and even excess beta (Chabot et al., 1998 SSNR, Austin). Non-systematically replicated
research showed the qEEG to predict the response of the patient to medications: the theta
types responded to stimulants, with the frontal alpha types, especially when the mean
frequency of alpha was slowed, responding to amphetamines.
Other researchers show some of the alpha types to respond to SSRI type antidepressants
(such as the OCD responsive type in work by L. Prichep and the depressives with frontal
alpha in work by S. Suffin and Emory).
In NT, the qEEG may be used to adjust the intervention, with frontal theta responding
to beta protocols with suppression of the excessive theta. The alpha frontal types respond
less well to frontal alpha downtraining than to posterior high alpha training with
concurrent frontal beta training. The frontal beta type seems to respond to normal
frequency alpha training posteriorly and frontal beta suppression if the beta is still
excessive (with any slowing noted suppressed as well). The frontal beta excess type is
often a difficult patient in my experience.
Frontal beta minima can be seen in frontal disturbances and seem to respond well to the
NF intervention. Areas commonly seen are F7 in attentional problems, F8 in impulsivity and
F3 or Fp1 in depression. The locus, or even presence, of a minima is difficult to predict
behaviorally, as other disturbances of function may yield the same behaviors.
Right frontal training and frontal symmetry
The commonly held belief that in NF the right frontal lobe should be avoided needs to
be explored and understood before discarding it.
Frontal alpha and beta interhemispheric ratios seem to correspond well with the
perceptual style of the subject. Right hemispheric dominant subjects (more beta and less
alpha than on the left) have a "glass is half empty" perception and a lower mood
state, or more depression.
NF with the frontal lobes needs to keep the dominance on the left, or to establish such
a dominance to avoid deteriorating mood states and perceptual styles in the client.
This has led some to avoid right frontal training, or frontal training in general.
Given the importance of the frontal lobes in brain pathophysiology and functional
regulation, this would be a drastically limiting elimination of potentially efficacious
NF intervention on brain function.
If care is taken to measure and assure the desired lateral symmetry, it is my
experience that frontal lobe training on either side may be done without significant
difficulty. Without this knowledge and care applied to protocol considerations, it could
be a large source of client dissatisfaction.
Spindling excessive beta
This pattern has been reported to be associated with 'cortical irritability', viral or
toxic encephalopathies and in epilepsy. It has a classically defined higher voltage beta
occasionally even exceeding 20 microvolts. This abnormal beta is seen in waxing and waning
spindles
over the affected cortex. This pattern is seen in less than 10% of the ADD/ADHD and
affective disordered population, but when seen, it is an important finding.
I have seen this excessive spindling beta in areas associated with pre-epileptic auras.
In one case it was seen occipitally during visual auras and in another fronto-temporally
with auras affecting the client more subtly, as a smell or even a remembrance.
This pattern responds very badly to any beta training, exacerbating the symptom
complex. Beta training is strongly contraindicated.
Beta suppression directly in the area of concern has shown good clinical response. The
band of frequencies to be suppressed should be selected based on individual profiles, not
by standard bands. I have seen broader bands like 14-22 Hz or bands as narrow as 14-16 Hz
in excess, with higher 20-30 Hz beta occasionally involved as well, with many variations.
The customizing of these interventions would be very difficult, if not impossible
without the qEEG to provide location, distribution and frequency range information to the
NF practitioner.
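As a hedged sketch of what that band customization might look like computationally (the sampling rate, simulated signal and band edges below are illustrative assumptions, not clinical values), one can scan narrow beta sub-bands for relative excess:

```python
import numpy as np
from scipy.signal import welch

def band_powers(signal, fs, bands):
    """Return the mean PSD power in each (lo, hi) band via Welch's method."""
    freqs, psd = welch(signal, fs=fs, nperseg=fs * 2)
    return {f"{lo}-{hi} Hz": psd[(freqs >= lo) & (freqs < hi)].mean()
            for lo, hi in bands}

# Hypothetical one-channel recording: 10 Hz alpha plus a narrow 15 Hz excess.
fs = 256
t = np.arange(0, 30, 1 / fs)
rng = np.random.default_rng(0)
eeg = (np.sin(2 * np.pi * 10 * t)            # normal posterior alpha
       + 2.0 * np.sin(2 * np.pi * 15 * t)    # excessive 14-16 Hz beta
       + 0.5 * rng.standard_normal(t.size))  # background noise

powers = band_powers(eeg, fs, [(8, 12), (14, 16), (16, 22), (22, 30)])
# The 14-16 Hz sub-band stands out against the neighboring beta bands,
# arguing for a narrow suppression band rather than a standard 14-22 Hz one.
```

This kind of scan only ranks sub-bands within one record; identifying true excess still requires the qEEG's normative comparison.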
Areas without significant activity seen in any EEG band
The EEG can show areas of decreased amplitude: not just a beta minimum, but an area
of general amplitude minimum. This phenomenon is seen well in mapping and has been reported
in areas of cortical dysfunction.
The EEG requires the generation of alternating currents for any voltages to be
measurable. This requires the activity of neurons in an area, with the increased blood flow
and glucose metabolism associated with these cellular processes. Blood flow in the brain is
regulated by the concentration of bicarbonate ion, measured as PCO2, a metabolic by-product
of the burning of glucose and the generation of energy within the Krebs cycle. The ADP/ATP
cycle within the mitochondria is the site of this electro-chemical interplay.
In careful work done by Pribram, the time frame for these events has been studied, with
the slower DC activity preceding the cellular action potential. This DC system has been
used since the 1970s in Europe in NF, showing that a shift to electro-positivity can
even stop an epileptiform discharge from occurring (N. Birbaumer).
This phenomenon also can be seen in qEEG as an area of decreased voltages in all bands,
progressing from beta through alpha to the slower frequencies. The area is effectively
shut down and is not functioning. When the brain's electropositivity increases enough, the
AC activity
seen in the EEG is inhibited.
These areas have been observed frontally in attentional and affective disorders
(Gunkelman, SSNR 1997), and are reported in observations of individuals who have been
brainwashed and have given the locus of control over to others (personal communication with
Brownback, Mason and associates).
Training these frequency "dead areas" is something newer in the field, but they
seem to respond to beta training, suppressing any slower frequencies that may still be
present. Beta is correlated highly with PET measurements of metabolic activity (I.A. Cook,
A. Leuchter et al., 1998, UCLA).
Generally low magnitudes
The occurrence of a low voltage EEG is considered a normal variant when it is a low
voltage fast EEG. When the low voltages appear slow, however, it is a diffuse and
non-specific abnormality. The difference between the two patterns is somewhat more
qualitative than quantitative: the morphologic presentation differs more significantly than
the magnitude differences in the quantitative analysis.
The magnitudes are all low, and in relative terms the power will look slowed in both
cases, though the faster morphologic pattern is a normal variant. When a low voltage slow
pattern is seen and is confirmed not to be drowsing, it should be evaluated for metabolic,
toxic or other diffuse encephalopathies, such as degenerative or post-hypoxic etiologies.
The low voltage slow type is reported in dementias as an early EEG change. This seems
to respond to high alpha training from 10 or 11 to 14 Hz. This is the same EEG effect as
nootropic medications (smart drugs) will provide.
The low voltage fast type usually corresponds with anxious, nervous and hypervigilant
individuals. Though not pathognomonic, it is commonly seen in alcoholism and alcohol free
members of families of alcoholics with a strong family history.
Interestingly, research on this pattern shows it responds to alcohol with the sudden
appearance of well-formed alpha. The alpha will slow, and rhythmic slower activity will
increase, if higher doses are given. The state is reported in euphoric terms by the
research subjects. This euphoria is also reported with alpha induced by opiates.
The low voltage fast pattern responds well to alpha training with a normal alpha
distribution of 8-12 or 9-11 Hz. The learning curve for this is well established, having a
fifth order curve fit. The phases of the experience are well defined by the curve.
There is an initial increase in alpha due to the habituation to the clinical setting,
with subsequent decreases in alpha during the active attempts at controlling the alpha.
These phases are followed by the release of the active attempts and a return to the
habituated level. This is followed by passive volitional attempts and the eventual
acquisition of voluntary control seen as the exponential increase at the end of training.
The subsequent alpha/theta training is commonly used by neurotherapists in these cases
when there is a concurrent addiction or intense life stress or trauma by history.
Temporal lobe alpha
When alpha is seen in the temporal lobe, it can be from a variety of causes, indicating
that a more complex NF protocol response may be needed.
The alpha from old head trauma is usually a faster alpha variant, adjusted for the
individual's alpha 'tuning'. This is seen over a year or two from the time of the trauma,
after the acute healing and swelling have long dissipated. It replaces the slowing which
may be seen initially.
Temporal alpha may also be an effect seen in response to a decrease in ipsilateral
frontal lobe activity. The decrease in uncinate fasciculus or inferior longitudinal
fasciculus stimulation from the frontal lobe allows the temporal lobe to be idle. This
usually will be seen with one of the frontal lobe patterns discussed earlier.
The temporal idling may be cleared up with the direct frontal work discussed earlier,
but may require lower band beta training directly on the temporal site.
Strong alpha at T5 or T6 can contaminate the ear references, yielding a false image of
frontal alpha in the qEEG. The ears, having alpha present, and the frontal lobes, without
alpha, are compared in the differential amplifier. The amplifier will falsely show alpha in
the frontal channel, which has to be controlled for by using a variety of montages in
reviewing the data. The frontal alpha will not be seen with sequential or non-contaminated
reference montages, such as Cz; in more sophisticated equipment, the 'common average' or
even the 'Hjorth' montage may be used to give a purer look at the data.
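A minimal numerical sketch of this reference contamination (all signals here are hypothetical) shows how alpha present only at the ear reference surfaces in the "frontal" channel of a differential amplifier:

```python
import numpy as np

fs = 256
t = np.arange(0, 10, 1 / fs)
rng = np.random.default_rng(1)

# Hypothetical signals: the frontal site has no alpha, but strong T5/T6
# alpha has spread into the ear reference electrode.
frontal = 0.3 * rng.standard_normal(t.size)   # alpha-free frontal EEG
ear_ref = 0.8 * np.sin(2 * np.pi * 10 * t)    # alpha contaminating A1/A2

# The differential amplifier records active minus reference, so the
# reference alpha appears (inverted) in the recorded "frontal" channel.
recorded = frontal - ear_ref

def alpha_power(x, fs):
    """Mean 8-12 Hz power from the FFT of the full epoch."""
    freqs = np.fft.rfftfreq(x.size, 1 / fs)
    psd = np.abs(np.fft.rfft(x)) ** 2 / x.size
    return psd[(freqs >= 8) & (freqs <= 12)].mean()

# The recorded channel shows far more alpha than the true frontal signal:
# a false frontal alpha created entirely by the contaminated reference.
```

Re-referencing (Cz, common average, Hjorth) changes what is subtracted, which is why comparing montages exposes the artifact.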
The temporal lobes seem sensitive to excessively fast beta training, with 14 Hz
training, 12-15 Hz, 14-16 Hz or other lower band beta used more commonly than a higher
frequency intervention due to this sensitivity. In my experience, these bands seem to be
best adjusted
based on clinical response, not to any obvious spectral loss in the CSA or amplitude
mapping.
The post traumatic faster alpha seems to respond to coherence training, seemingly
reconnecting the functional relationships. This requires the use of qEEG coherence
measurements, as without this, the training sites and bands would not be evaluated or
selectable based on any objective
criteria.
A note on coherence and phase
The cortex is full of neural connections, all electrically active, though not all seen
with the EEG. The EEG is measuring summations of radially generated action potentials from
pyramidal cells, not the laterally oriented cortical-cortical tracts.
The raw EEG is blind to nearly two thirds of the electrical activity of the cortex, much
of which is seen by the MEG, a measure of the "magnetic" activity of the brain
(actually the lateral current flow of intracellular activity). The MEG is, however, blind
to the radially arranged extracellular potentials which comprise the EEG.
The longitudinal myelinated fiber tracts are the high speed web of the cortex, not the
organized subcortical radiations of the thalamo-cortical systems or other
subcortical-cortical projection pathways. This network of local arcuate, longitudinal,
fronto-temporal (uncinate) and interhemispheric callosal tracts is invisible to the raw
EEG.
The activity of this invisible network, however, can be inferred from the coherence and
phase relationships between areas "hooked up" by the network. This spatial
distribution, a "connectivity" reflection of the subsurface activity, is
obviously quite complex. The covariance of power at two sites (coherence) or their
covariance in time (phase) gives a measure, within the EEG, of activities within this
invisible network.
The view of the raw coherence and phase data yields little to the non-expert. When
compared to an age and sex matched normative database, displayed as a Z-score, the
magnitude and direction of variance, including the significance of the variance become
much more evident.
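The Z-score transform itself is simple; a sketch, with purely hypothetical normative values standing in for a real age- and sex-matched database:

```python
def z_score(observed, norm_mean, norm_sd):
    """Deviation of an observed measure from its normative mean,
    in normative standard deviations."""
    return (observed - norm_mean) / norm_sd

# Hypothetical normative values for one coherence pair and band.
z = z_score(observed=0.85, norm_mean=0.60, norm_sd=0.10)
# z = 2.5: two and a half standard deviations above the norm, a
# magnitude and direction of variance evident at a glance.
```

The value of the display lies entirely in the database behind it; the arithmetic is trivial.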
There are two types of phase relationships seen in qEEG: conducted phase differences
and propagated phase differences. The propagated phase is seen where a focal phase
reversal indicates the source of an EEG phenomenon. The conducted phase is simply the
time delay due to propagation along neural pathways.
The phase measurement reflects the correlation of covariance in time of activity at two
sites. This temporal relationship can be slowed by damage to tracts due to demyelinating,
structural or toxic/metabolic influences. The phase relationship may be faster with
increased nerve conduction velocity, or by volume conduction through fluids and as a field
effect.
The simultaneous projection to various cortical locations of synchronized volleys from
thalamo-cortical radiations seen during thalamic activity paced by the ventral-medial or
reticular nucleus is a "highly connected" state, with high phase synchrony. This
unique synchronized state is predictive of meditative expertise in Zen meditators (Gevins
et al.). It involves progressively generalizing increased phase synchrony in alpha,
followed by increases in theta
synchrony (possibly representing slowed alpha).
Similar findings have been seen in Yogic meditators, though with differences in
habituation to repetitive sensory stimulation. The Yogic meditators habituated faster than
normal and the Zen meditators failed to habituate. Interestingly, this reflects the
philosophical views of the two meditative techniques. The Zen meditation aspires to
novelty of experience while the Yogic traditions emphasize the imaginary nature of the
external reality.
The phase relationship is measured as the difference in degrees of arc between two
waveforms, measured at a point in time. This is reported as degrees of phase shift
from 0-180 degrees, or as time synchrony of the waveforms, reported in milliseconds.
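The conversion between the two reporting conventions follows directly from the cycle length of the band in question; a small sketch (frequency and shift values are illustrative):

```python
def phase_shift_ms(degrees, freq_hz):
    """Convert a phase shift in degrees to a time delay in milliseconds:
    one full cycle (360 degrees) at f Hz lasts 1000 / f milliseconds."""
    return degrees / 360.0 * (1000.0 / freq_hz)

# A 36-degree shift at 10 Hz alpha is one tenth of a 100 ms cycle:
delay = phase_shift_ms(36, 10)   # 10.0 ms
```

The same shift in degrees corresponds to a shorter delay at higher frequencies, which is why the band must always be stated alongside the phase figure.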
Often confused with phase is the EEG measure of coherence. The confusion comes from the
use of the term "coherence" as an adjective descriptor for phase. This misuse of
a technical term as a descriptor in the same field of interest is problematic. The term
"phase coherence" should be
eliminated and replaced by "phase synchrony" or, most properly, by specifying
a phase relationship from 0-180 degrees of phase shift or by giving the Z-score deviation
from normal. Please retain the use of coherence for its technically proper role within
this field.
Coherence is the cross correlation of the power (or amplitude) of activity at two
sites. Sites that covary highly are presumably processing related cortical/subcortical
volleys or have a high "connectivity". This is reported as a value from 0 to 1
(on some older equipment, as a value from 0% to 100% coherence).
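As a hedged illustration (simulated channels, not clinical data), the 0-1 character of coherence, and its indifference to amplitude asymmetry, can be seen with `scipy.signal.coherence`:

```python
import numpy as np
from scipy.signal import coherence

fs = 256
t = np.arange(0, 20, 1 / fs)
rng = np.random.default_rng(2)

# Two hypothetical channels sharing the same 10 Hz generator, each with
# its own noise; a third channel is independent noise.
shared = np.sin(2 * np.pi * 10 * t)
ch_a = shared + 0.5 * rng.standard_normal(t.size)
ch_b = 0.5 * shared + 0.5 * rng.standard_normal(t.size)  # 2:1 amplitude asymmetry
ch_c = rng.standard_normal(t.size)                       # unrelated site

freqs, coh_ab = coherence(ch_a, ch_b, fs=fs, nperseg=512)
_, coh_ac = coherence(ch_a, ch_c, fs=fs, nperseg=512)

alpha_bin = np.argmin(np.abs(freqs - 10))
# coh_ab[alpha_bin] is near 1 despite the amplitude asymmetry, while
# coh_ac[alpha_bin] stays near 0: a pure 0-1 measure of covariation.
```

Note the result is a value per frequency bin, which is why coherence findings are always reported band by band.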
The cortical-cortical long fiber connections compete with the shorter connections. As
Bob Thatcher says, "The close siblings can speak to each other, but not to their
distant cousin at the same time." This is called the two-compartmental model of
coherence.
There is, I believe, a third compartment, the subcortical-cortical compartment, which
would explain the observation of high local coherence despite decreased local connections.
The coherence comes from a subcortical source connected to both sites, not from the
cortical-cortical compartment's connectivity.
Coherence is most easily viewed as morphological similarity, corrected for absolute
magnitude and irrespective of time synchronization. When two waveforms are shifted in time
until maximal coherence values are attained, the time-base shift is the phase delay of
the waveforms.
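That shift-until-maximal-similarity view of phase delay can be sketched directly with a synthetic 10 Hz pair carrying a known 25 ms delay (all parameters illustrative):

```python
import numpy as np

fs = 1000
t = np.arange(0, 2, 1 / fs)
freq = 10.0
lag_samples = 25                          # true delay: 25 ms at 1 kHz

# Hypothetical pair: the second waveform is the first, delayed by 25 ms.
a = np.sin(2 * np.pi * freq * t)
b = np.roll(a, lag_samples)

# Shift b back in time until its correlation with a is maximal; within
# one 100 ms cycle of 10 Hz, that shift recovers the phase delay.
max_shift = int(fs / freq)                # search one full cycle
corrs = [np.dot(a, np.roll(b, -s)) for s in range(max_shift)]
delay_ms = np.argmax(corrs) * 1000 / fs   # 25.0 ms
```

The search is deliberately limited to one cycle: for narrow-band activity the delay is only determined up to a whole cycle, one reason phase is conventionally reported within 0-180 degrees.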
Structural damage produces reliable, consistent patterns of change in phase and
coherence patterning in the qEEG, though functional influences account for variable
findings. This variance means interpretation of the data must rely only on strong
patterns of variance, not isolated findings.
The proper conservative interpretation of these data must also be viewed in the context
of the entire constellation of findings from the rest of the patient's EEG, qEEG and
clinical presentation.
When training changes in coherence or phase using neurofeedback, over-training
(shifting beyond a normal relationship range) needs to be avoided. This requires merely
setting a proper normalized goal or value as a training target.
There is currently what I believe to be an inappropriate practice in coherence NT. This
is the use of a single-channel EEG machine used sequentially (old term: bipolar) with the
assumption that this will feed back coherence. The assumption is that highly coherent
activity cancels, thus decreasing the amplitude of the channel, so that training amplitude
could train coherence.
This is a grossly false model of coherence feedback. It assumes the phase of the
activity to be synchronous. It also assumes that the synchronous activity has amplitude
equivalence. In reality, the coherence calculation equally weights amplitudes, correcting
for asymmetries; the amplifier does not. The calculation of coherence is also not time
locked, measuring covariance independent of the time-locked nature of the amplifier's
response.
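A small simulation (hypothetical signals and amplitudes) makes the flaw concrete: a near-perfectly coherent pair with a 3:1 amplitude asymmetry still leaves a large-amplitude sequential channel, so down-training that amplitude is not training coherence:

```python
import numpy as np
from scipy.signal import coherence

fs = 256
t = np.arange(0, 20, 1 / fs)
rng = np.random.default_rng(3)

# Hypothetical pair: a fully shared generator, but a 3:1 amplitude
# asymmetry between the two sites.
shared = np.sin(2 * np.pi * 10 * t)
site_1 = 3.0 * shared + 0.1 * rng.standard_normal(t.size)
site_2 = 1.0 * shared + 0.1 * rng.standard_normal(t.size)

# A sequential (bipolar) channel records the difference; the shared
# activity does NOT cancel, because the amplitudes are unequal.
bipolar = site_1 - site_2
bipolar_rms = np.sqrt(np.mean(bipolar ** 2))   # large, despite coherence

# The coherence calculation weights the two amplitudes equally and
# still reports the sites as highly coherent at 10 Hz.
freqs, coh = coherence(site_1, site_2, fs=fs, nperseg=512)
coh_10 = coh[np.argmin(np.abs(freqs - 10))]    # near 1
```

The amplifier sees a healthy difference signal exactly where the coherence calculation sees near-perfect connectivity; the two measures can move independently.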
I believe we do not yet have enough information about the clinical impacts and needed
protocol controls to make full use of the clinical application of these types of training.
The clinical application of these techniques is highly fruitful ground, needing
systematic, clinically valid research and protocol development.
More conclusive research, based on the ongoing application of this training, is needed
before rules for intervention are presented.
I have nothing against the careful clinical use of coherence or phase training. When
the more "traditional" neurofeedback applications fail, these phase and
coherence based interventions should be empirically tried. With more trials using the
proper measures, design and controls,
advances will be made.
The departing caveat
Now that I have completed this codification of some of the interventions seen currently
in qEEG based NT, I can say without question that this will be a source of future
embarrassment. To capture a snapshot during a field's rapid development stage is a
guaranteed red face in the
future. If you don't believe this, look at your own childhood photographs!
So, please be gentle with your ridicule, and don't mistake this art to be a science ...
not just yet.
- Suggested readings and study areas:
- Books:
- Fundamentals of EEG Technology, Volume 1: Basic Concepts and Methods, by Tyner and Knott, Raven Press, 1983
- Electroencephalography: Basic Principles, Clinical Applications and Related Fields, Ernst Niedermeyer and Lopes da Silva, Williams and Wilkins, 1993, ISBN 0-683-06511-4 ("the unabridged encyclopedic dictionary of EEG", a true lifelong reference, with good chapters on qEEG)
- Journals:
- Journal of NT (SNR, see SNR web site below)
- Clinical Electroencephalography (AMEEGA), 850 Elm Grove Road, Elm Grove, WI 53122
- Electroencephalography and Clinical Neurophysiology (IFCN), call for subscriptions: 708-216-3405
- Web sites:
- neuronetinc.com
- snr-jnt.org
- aapb.com