
9.9: General Recommendations


    In this section, I summarize my recommendations for ICA-based artifact correction. These recommendations reflect my assessment of the current state of the art. It is difficult to determine the optimal approach to ICA analytically or with simulations, so these recommendations are mainly based on a combination of informal experience (my own and others’) and empirical studies (which are often unsatisfying because ground truth is not usually known). As a result, these recommendations may change as more information accumulates, and other researchers may reasonably disagree with some of them.

    Also, these recommendations are based largely on studies of compliant adult participants, and some changes may be necessary for other populations. So, use them as a starting point for designing your own artifact correction approach, not as a recipe to be followed blindly. And make sure that your approach is designed to achieve the fundamental goal of obtaining an accurate answer to the scientific question your study is designed to answer (which typically involves reducing noise and avoiding confounds from artifactual voltages and from changes in the sensory input).

    Now that I’m done with the caveats, here are my recommendations:

    • The correction should ordinarily be carried out on continuous data. If you have very long intertrial intervals, you can use epoched data, but the epochs must be at least 3 seconds long (e.g., -1000 to +2000 ms). No baseline correction should be applied (Groppe et al., 2009). There should be no overlap between consecutive epochs. Also, the epoching should be performed after the high-pass filtering step described later.
    • The data should be referenced, and all of the channels included in the ICA decomposition should share the same reference. Any reasonable reference is fine, but extra steps are required if you use the average of all sites as the reference (see Makoto’s Preprocessing Pipeline or EEGLAB’s ICA documentation). Create additional bipolar channels for blinks and eye movements (but these will be left out of the decomposition).
    • If you have large amounts of line noise, use the cleanline plugin to eliminate it (see Bigdely-Shamlo et al., 2015 for important details about implementing this tool). This is better than applying a low-pass filter (but a low-pass filter is sufficient for modest amounts of line noise).
    • Scroll through the entire dataset to make sure you know what kinds of artifacts and other problems are present. Determine which channels should be interpolated.
    • Once you have applied the above steps, make a separate copy of the dataset for use in the ICA decomposition. Apply the following steps to that copy, but not to the original data (a code sketch of these steps appears after this list).
      • High-pass filter at 1 Hz (I recommend 48 dB/octave). If you will not be analyzing high-frequency activity (e.g., gamma-band oscillations), you should also apply a low-pass filter at 30 Hz (48 dB/octave).
      • Downsample the data to 100 Hz.
      • Delete time periods during breaks.
      • Delete segments of huge C.R.A.P.
      • Run the ICA decomposition with the runica algorithm, using the ‘extended’, 1 option. Leave out the bipolar channels and any channels that you plan to interpolate.
      • Verify that the ICs are reasonable (e.g., not too many irregular scalp maps, especially for the top half of the ICs). If they’re not, take another look at the data and see if there are problems you missed.
      • When you are first starting out (or switching to a different kind of experimental paradigm or participant population), it’s a good idea to repeat the decomposition and make sure that you get similar results each time. If you don’t, then the decomposition is not working well. The RELICA plugin can be used to provide a quantitative assessment of the reliability of the decomposition.
      • If you can’t get a good decomposition, check to make sure that you have enough data. The informal rule is that the number of time points in the dataset must be at least 20 × (number of channels)² (assuming that your original data were sampled at ~250 Hz). If you don’t have enough data, one option is to apply PCA first to reduce the dimensionality of your data. However, this can create significant problems (Artoni et al., 2018), and you shouldn’t use it unless you have more than 128 channels.
    • Transfer the ICA weights to the original version of the dataset (the version right before you made the copy).
    • Evaluate the ICs with the following steps (illustrated in the second sketch after this list):
      • Examine the ICs carefully to make sure everything looks OK and to identify the key ICs. This includes looking at the scalp map, the frequency spectrum, and the time course heat map for each key IC, and then scrolling through the IC activations and voltages simultaneously to see what voltage changes co-occur with the key ICs.
      • Compute SME values corresponding to your planned data analysis (e.g., the mean amplitude from 125–225 ms for the MMN) before versus after correcting for each IC to see if correction actually improves your data quality.
      • For each IC that you may want to remove, reconstruct your data using only that IC. Then average the data and see if that IC varies systematically across conditions (or groups in a between-subjects design, but that requires making grand averages across participants).
    • Remove the ICs that correspond to clear, well-understood artifacts, have been well isolated by ICA, and are actually problematic (e.g., reduce your data quality or differ across conditions in the averaged ERPs).
    • Interpolate the bad channels that you previously identified.
    • After performing artifact correction, you should perform artifact detection on the epoched data. At a minimum, you want to eliminate epochs with C.R.A.P., because these are not handled well by artifact correction. In visual experiments, you should also mark and reject epochs with blinks and eye movements that might have interfered with the perception of the stimuli (e.g., between -200 and 400 ms). You can use the bipolar EOG channels for this because they were not corrected (see the third sketch after this list).
    • As a final check, you can apply artifact detection/rejection to your data instead of artifact correction and then compare the grand averages from these two approaches. Ideally, the grand averages should be similar, but noisier for the rejection version than for the correction version (because fewer trials are available). If you see large differences between the rejected and corrected versions, this may indicate that the correction has reduced an important source of neural activity (because your artifact ICs contained a mixture of brain activity and artifacts) or that it has failed to fully correct for the artifacts.
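
    To make these steps concrete, here is a minimal sketch of the pipeline up through the decomposition. It uses the MNE-Python package rather than the MATLAB tools described in the text, so treat it as an illustrative translation under stated assumptions, not the exact procedure: the file name, channel names, and filter details are all placeholders.

```python
# Sketch of the pipeline up through the ICA decomposition (MNE-Python).
# File, channel, and site names are hypothetical.
import mne

raw = mne.io.read_raw_fif("sub01_raw.fif", preload=True)

# Reference all scalp channels the same way (linked mastoids here).
raw.set_eeg_reference(["TP9", "TP10"])

# Bipolar EOG channels for blinks and eye movements; these will be
# left out of the decomposition. IO1 is a hypothetical infraorbital site.
raw = mne.set_bipolar_reference(raw, anode=["Fp1", "F8"], cathode=["IO1", "F7"],
                                ch_name=["VEOG", "HEOG"], drop_refs=False)

# Line-noise removal by spectral fitting, roughly analogous to cleanline.
raw.notch_filter(60, method="spectrum_fit")

bads = ["P9"]                        # channels you plan to interpolate
not_in_ica = bads + ["VEOG", "HEOG"]
picks = [ch for ch in raw.ch_names if ch not in not_in_ica]

# Informal rule: at least 20 x (number of channels)^2 time points
# (the rule assumes the original ~250 Hz sampling rate).
if raw.n_times < 20 * len(picks) ** 2:
    print("Warning: dataset may be too short for a stable decomposition")

# Separate copy for the decomposition; the original stays untouched.
raw_ica = raw.copy()
raw_ica.filter(l_freq=1.0, h_freq=30.0)  # 1 Hz high-pass, 30 Hz low-pass
raw_ica.resample(100)                    # downsample to 100 Hz
# ... delete break periods and segments of huge C.R.A.P. here ...

# Extended infomax, the counterpart of runica with the 'extended', 1 option.
ica = mne.preprocessing.ICA(method="infomax",
                            fit_params=dict(extended=True), random_state=97)
ica.fit(raw_ica, picks=picks)
ica.plot_components()                    # inspect the scalp maps
```

    Re-running the fit with different values of random_state is an easy way to implement the repeat-the-decomposition reliability check described above.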
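
    Continuing the sketch, the second fragment transfers the weights to the original data and evaluates a single IC. In MNE-Python the unmixing matrix lives in the ICA object rather than in the dataset, so "transferring the weights" simply means applying that object to the original recording. The SME here is the analytic form for a mean-amplitude score (the standard deviation of the single-trial scores divided by the square root of the number of trials); the channel, measurement window, and component index are placeholders.

```python
import numpy as np

# Epoch the original data (baseline correction deferred until after correction).
events, event_id = mne.events_from_annotations(raw)
epochs = mne.Epochs(raw, events, event_id, tmin=-0.2, tmax=0.8,
                    baseline=None, preload=True)

# Reconstruct the data from one IC alone to see what it contributes, and
# average it to check whether it varies systematically across conditions.
ic = 0  # index of the IC under evaluation
only_this_ic = ica.apply(epochs.copy(), include=[ic])
only_this_ic.average().plot()

# SME for the mean amplitude from 125-225 ms, before vs. after removal.
def sme(ep, ch, tmin=0.125, tmax=0.225):
    mask = (ep.times >= tmin) & (ep.times <= tmax)
    scores = ep.get_data(picks=[ch])[:, 0, mask].mean(axis=1)
    return scores.std(ddof=1) / np.sqrt(len(scores))

corrected = ica.apply(epochs.copy(), exclude=[ic])
print("SME before:", sme(epochs, "Fz"), " after:", sme(corrected, "Fz"))
```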
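
    A third fragment sketches post-correction artifact detection and the final rejection-versus-correction check. The simple peak-to-peak criterion on the uncorrected bipolar VEOG channel stands in for ERPLAB's moving-window tools, and the 100 µV threshold is an assumption, not a recommendation from the text.

```python
# Remove the ICs judged to be artifactual (indices are placeholders).
corrected_all = ica.apply(epochs.copy(), exclude=[0])

# Reject epochs with blinks between -200 and 400 ms, detected on the
# bipolar VEOG channel, which was left out of the correction.
veog = corrected_all.copy().pick(["VEOG"]).crop(tmin=-0.2, tmax=0.4)
p2p = np.ptp(veog.get_data()[:, 0, :], axis=1)  # peak-to-peak per epoch
bad_epochs = np.where(p2p > 100e-6)[0]          # 100 µV criterion (volts)
clean = corrected_all.copy().drop(bad_epochs)

# Final check: rejection alone vs. correction, compared in the averages.
rejected_only = epochs.copy().drop(bad_epochs)
mne.viz.plot_compare_evokeds({"corrected": clean.average(),
                              "rejected": rejected_only.average()})
```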

    I’d like to say a few words about how interpolation interacts with artifact correction. You don’t want the “bad channels” to mess up the ICA decomposition, so these channels need to be excluded from the decomposition stage. You’ll then perform the interpolation after the decomposition has been performed and the data have been corrected. That way, the interpolated channels will reflect the corrected data (sketched below). My lab spent about 20 minutes one day talking through the different possible orders of steps for combining interpolation and correction, and this approach was clearly the best.
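
    Under the same assumptions as the sketches above, this ordering comes down to interpolating only after the correction has been applied, so that the interpolated values are built from corrected neighbors:

```python
# Interpolate the bad channel only after ICA correction has been applied.
clean.info["bads"] = ["P9"]             # the channel flagged earlier
clean.interpolate_bads(reset_bads=True)
```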


