diff --git a/.buildinfo b/.buildinfo index 3ed22fc2b..d59181d70 100644 --- a/.buildinfo +++ b/.buildinfo @@ -1,4 +1,4 @@ # Sphinx build info version 1 # This file hashes the configuration used when building these files. When it is not found, a full rebuild will be done. -config: 0b0845f06aa0ea4305158e51e2e10944 +config: 94b4cc70cc05f65dc8cc2bd92a8c3d1e tags: 645f666f9bcd5a90fca523b33c5a78b7 diff --git a/_downloads/07fcc19ba03226cd3d83d4e40ec44385/auto_examples_python.zip b/_downloads/07fcc19ba03226cd3d83d4e40ec44385/auto_examples_python.zip index 61d2c7e6b..9ec6e1a2a 100644 Binary files a/_downloads/07fcc19ba03226cd3d83d4e40ec44385/auto_examples_python.zip and b/_downloads/07fcc19ba03226cd3d83d4e40ec44385/auto_examples_python.zip differ diff --git a/_downloads/2cbf7eec12f4415419df2cf85cbe5c5b/visual_p300_python.zip b/_downloads/2cbf7eec12f4415419df2cf85cbe5c5b/visual_p300_python.zip index 0f1a8209e..b30bcdf7d 100644 Binary files a/_downloads/2cbf7eec12f4415419df2cf85cbe5c5b/visual_p300_python.zip and b/_downloads/2cbf7eec12f4415419df2cf85cbe5c5b/visual_p300_python.zip differ diff --git a/_downloads/2f1a411d3414306e436c6540c86910c5/00x__n170_run_experiment.ipynb b/_downloads/2f1a411d3414306e436c6540c86910c5/00x__n170_run_experiment.ipynb index 4ccbaaf4d..fc9f97eda 100644 --- a/_downloads/2f1a411d3414306e436c6540c86910c5/00x__n170_run_experiment.ipynb +++ b/_downloads/2f1a411d3414306e436c6540c86910c5/00x__n170_run_experiment.ipynb @@ -4,7 +4,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "\n# N170 run experiment\n\nThis example demonstrates the initiation of an EEG stream with eeg-notebooks, and how to run \nan experiment. \n" + "\n# N170 run experiment\n\nThis example demonstrates the initiation of an EEG stream with eeg-expy, and how to run \nan experiment. 
\n" ] }, { diff --git a/_downloads/2fc210914b93a8aaaaa1a6667585ad74/visual_cueing_jupyter.zip b/_downloads/2fc210914b93a8aaaaa1a6667585ad74/visual_cueing_jupyter.zip index 9dc938eea..68e819b55 100644 Binary files a/_downloads/2fc210914b93a8aaaaa1a6667585ad74/visual_cueing_jupyter.zip and b/_downloads/2fc210914b93a8aaaaa1a6667585ad74/visual_cueing_jupyter.zip differ diff --git a/_downloads/335ce423c80436163e4a75b13e3dba64/00x__ssvep_run_experiment.py b/_downloads/335ce423c80436163e4a75b13e3dba64/00x__ssvep_run_experiment.py index dfb8c0462..ce23c1766 100644 --- a/_downloads/335ce423c80436163e4a75b13e3dba64/00x__ssvep_run_experiment.py +++ b/_downloads/335ce423c80436163e4a75b13e3dba64/00x__ssvep_run_experiment.py @@ -2,7 +2,7 @@ SSVEP run experiment =============================== -This example demonstrates the initiation of an EEG stream with eeg-notebooks, and how to run +This example demonstrates the initiation of an EEG stream with eeg-expy, and how to run an experiment. """ @@ -40,4 +40,4 @@ # --------------------- # ssvep = VisualSSVEP(duration=record_duration, eeg=eeg_device, save_fn=save_fn) -ssvep.run() \ No newline at end of file +ssvep.run() diff --git a/_downloads/482813616f7e52f19737a9e9e4714600/01r__ssvep_viz.ipynb b/_downloads/482813616f7e52f19737a9e9e4714600/01r__ssvep_viz.ipynb index d197e6457..f73ef5de7 100644 --- a/_downloads/482813616f7e52f19737a9e9e4714600/01r__ssvep_viz.ipynb +++ b/_downloads/482813616f7e52f19737a9e9e4714600/01r__ssvep_viz.ipynb @@ -4,7 +4,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "\n# SSVEP Visualization\n\nThis example demonstrates loading, organizing, and visualizing data from the steady-state visual evoked potentials (SSVEP) experiment. \n\nThe data used is the first subject and first session of the one of the eeg-notebooks ssvep example datasets, recorded using the InteraXon MUSE EEG headset (2016 model). This session consists of six two-minute blocks of continuous recording. 
\n\nWe first use the `fetch_datasets` to obtain a list of filenames. If these files are not already present \nin the specified data directory, they will be quickly downloaded from the cloud. \n\nAfter loading the data, we place it in an MNE `Epochs` object, and obtain the trial-averaged response. \n\nThe final figures show the visual frequencies appearing in the measured power spectrum. \n" + "\n# SSVEP Visualization\n\nThis example demonstrates loading, organizing, and visualizing data from the steady-state visual evoked potentials (SSVEP) experiment. \n\nThe data used is the first subject and first session of the one of the eeg-expy ssvep example datasets, recorded using the InteraXon MUSE EEG headset (2016 model). This session consists of six two-minute blocks of continuous recording. \n\nWe first use the `fetch_datasets` to obtain a list of filenames. If these files are not already present \nin the specified data directory, they will be quickly downloaded from the cloud. \n\nAfter loading the data, we place it in an MNE `Epochs` object, and obtain the trial-averaged response. \n\nThe final figures show the visual frequencies appearing in the measured power spectrum. 
\n" ] }, { @@ -22,7 +22,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "Load Data\n ---------------------\n\n We will use the eeg-notebooks SSVEP example dataset\n\n Note that if you are running this locally, the following cell will download\n the example dataset, if you do not already have it.\n\n##################################################################################################\n\n" + "Load Data\n ---------------------\n\n We will use the eeg-expy SSVEP example dataset\n\n Note that if you are running this locally, the following cell will download\n the example dataset, if you do not already have it.\n\n##################################################################################################\n\n" ] }, { diff --git a/_downloads/4c0d23639fbd212d64cd06010552da22/visual_n170_python.zip b/_downloads/4c0d23639fbd212d64cd06010552da22/visual_n170_python.zip index 7417a01c0..be504becb 100644 Binary files a/_downloads/4c0d23639fbd212d64cd06010552da22/visual_n170_python.zip and b/_downloads/4c0d23639fbd212d64cd06010552da22/visual_n170_python.zip differ diff --git a/_downloads/67001aab4aa80bdc4a44406add4d085e/visual_cueing_python.zip b/_downloads/67001aab4aa80bdc4a44406add4d085e/visual_cueing_python.zip index 33e73a871..104842909 100644 Binary files a/_downloads/67001aab4aa80bdc4a44406add4d085e/visual_cueing_python.zip and b/_downloads/67001aab4aa80bdc4a44406add4d085e/visual_cueing_python.zip differ diff --git a/_downloads/679acb9da15a2bd0f1cf78d5dea6995b/00x__n170_run_experiment.py b/_downloads/679acb9da15a2bd0f1cf78d5dea6995b/00x__n170_run_experiment.py index 51b6d2c6e..03249e5dd 100644 --- a/_downloads/679acb9da15a2bd0f1cf78d5dea6995b/00x__n170_run_experiment.py +++ b/_downloads/679acb9da15a2bd0f1cf78d5dea6995b/00x__n170_run_experiment.py @@ -2,7 +2,7 @@ N170 run experiment =============================== -This example demonstrates the initiation of an EEG stream with eeg-notebooks, and how to run +This example demonstrates the 
initiation of an EEG stream with eeg-expy, and how to run an experiment. """ diff --git a/_downloads/6c7787744c46f6694529791768174f35/02r__ssvep_decoding.py b/_downloads/6c7787744c46f6694529791768174f35/02r__ssvep_decoding.py index 4d6e8155b..22523a171 100644 --- a/_downloads/6c7787744c46f6694529791768174f35/02r__ssvep_decoding.py +++ b/_downloads/6c7787744c46f6694529791768174f35/02r__ssvep_decoding.py @@ -2,7 +2,7 @@ SSVEP Decoding =============================== -This notebook runs only the data analysis part of N170 notebook. +This notebook runs only the data analysis part of the experiment. Look at the notes to see how this can be run on the web with binder or google collab. diff --git a/_downloads/6f1e7a639e0699d6164445b55e6c116d/auto_examples_jupyter.zip b/_downloads/6f1e7a639e0699d6164445b55e6c116d/auto_examples_jupyter.zip index 7f94d2d8b..6acb05eee 100644 Binary files a/_downloads/6f1e7a639e0699d6164445b55e6c116d/auto_examples_jupyter.zip and b/_downloads/6f1e7a639e0699d6164445b55e6c116d/auto_examples_jupyter.zip differ diff --git a/_downloads/89f5ff349033ec3c3d48bcfb2c3c3de2/01r__ssvep_viz.py b/_downloads/89f5ff349033ec3c3d48bcfb2c3c3de2/01r__ssvep_viz.py index 22da3b6a0..d9a20692b 100644 --- a/_downloads/89f5ff349033ec3c3d48bcfb2c3c3de2/01r__ssvep_viz.py +++ b/_downloads/89f5ff349033ec3c3d48bcfb2c3c3de2/01r__ssvep_viz.py @@ -4,7 +4,7 @@ This example demonstrates loading, organizing, and visualizing data from the steady-state visual evoked potentials (SSVEP) experiment. -The data used is the first subject and first session of the one of the eeg-notebooks ssvep example datasets, recorded using the InteraXon MUSE EEG headset (2016 model). This session consists of six two-minute blocks of continuous recording. +The data used is the first subject and first session of one of the eeg-expy ssvep example datasets, recorded using the InteraXon MUSE EEG headset (2016 model). This session consists of six two-minute blocks of continuous recording. 
We first use the `fetch_datasets` to obtain a list of filenames. If these files are not already present in the specified data directory, they will be quickly downloaded from the cloud. @@ -38,7 +38,7 @@ # Load Data # --------------------- # -# We will use the eeg-notebooks SSVEP example dataset +# We will use the eeg-expy SSVEP example dataset # # Note that if you are running this locally, the following cell will download # the example dataset, if you do not already have it. diff --git a/_downloads/8d83acdbf2986ad81d4d9e035ff593ef/01r__n170_viz.ipynb b/_downloads/8d83acdbf2986ad81d4d9e035ff593ef/01r__n170_viz.ipynb index 00fdb815c..d32078a10 100644 --- a/_downloads/8d83acdbf2986ad81d4d9e035ff593ef/01r__n170_viz.ipynb +++ b/_downloads/8d83acdbf2986ad81d4d9e035ff593ef/01r__n170_viz.ipynb @@ -4,7 +4,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "\n# N170 Load and Visualize Data\n\nThis example demonstrates loading, organizing, and visualizing ERP response data from the visual N170 experiment. \n\nImages of faces and houses are shown in a rapid serial visual presentation (RSVP) stream.\n\nThe data used is the first subject and first session of the one of the eeg-notebooks N170 example datasets, recorded using the InteraXon MUSE EEG headset (2016 model). \nThis session consists of six two-minute blocks of continuous recording. \n\nWe first use the `fetch_datasets` to obtain a list of filenames. If these files are not already present \nin the specified data directory, they will be quickly downloaded from the cloud. \n\nAfter loading the data, we place it in an MNE `Epochs` object, and obtain the trial-averaged response. \n\nThe final figure plotted at the end shows the N170 response ERP waveform. \n" + "\n# N170 Load and Visualize Data\n\nThis example demonstrates loading, organizing, and visualizing ERP response data from the visual N170 experiment. 
\n\nImages of faces and houses are shown in a rapid serial visual presentation (RSVP) stream.\n\nThe data used is the first subject and first session of the one of the eeg-expy N170 example datasets, recorded using the InteraXon MUSE EEG headset (2016 model). \nThis session consists of six two-minute blocks of continuous recording. \n\nWe first use the `fetch_datasets` to obtain a list of filenames. If these files are not already present \nin the specified data directory, they will be quickly downloaded from the cloud. \n\nAfter loading the data, we place it in an MNE `Epochs` object, and obtain the trial-averaged response. \n\nThe final figure plotted at the end shows the N170 response ERP waveform. \n" ] }, { @@ -29,7 +29,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "## Load Data\n\nWe will use the eeg-notebooks N170 example dataset\n\nNote that if you are running this locally, the following cell will download\nthe example dataset, if you do not already have it.\n\n\n" + "## Load Data\n\nWe will use the eeg-expy N170 example dataset\n\nNote that if you are running this locally, the following cell will download\nthe example dataset, if you do not already have it.\n\n\n" ] }, { diff --git a/_downloads/a74331ba25f1e4b93e2ecf2b3efd4a13/02r__ssvep_decoding.ipynb b/_downloads/a74331ba25f1e4b93e2ecf2b3efd4a13/02r__ssvep_decoding.ipynb index e5c4128b0..dd4f9f6a2 100644 --- a/_downloads/a74331ba25f1e4b93e2ecf2b3efd4a13/02r__ssvep_decoding.ipynb +++ b/_downloads/a74331ba25f1e4b93e2ecf2b3efd4a13/02r__ssvep_decoding.ipynb @@ -4,7 +4,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "\n# SSVEP Decoding\n\nThis notebook runs only the data analysis part of N170 notebook.\n\nLook at the notes to see how this can be run on the web with binder or google collab.\n\nAll of the additional notes are removed; only the code cells are kept.\n" + "\n# SSVEP Decoding\n\nThis notebook runs only the data analysis part of experiment.\n\nLook at the notes to see how 
this can be run on the web with binder or google collab.\n\nAll of the additional notes are removed; only the code cells are kept.\n" ] }, { diff --git a/_downloads/b1193b749e4caeff4dbfdf0c20ae6801/visual_ssvep_jupyter.zip b/_downloads/b1193b749e4caeff4dbfdf0c20ae6801/visual_ssvep_jupyter.zip index 329fe4c07..59cc1d580 100644 Binary files a/_downloads/b1193b749e4caeff4dbfdf0c20ae6801/visual_ssvep_jupyter.zip and b/_downloads/b1193b749e4caeff4dbfdf0c20ae6801/visual_ssvep_jupyter.zip differ diff --git a/_downloads/b1c28450479cda89a20f021d24809e55/visual_n170_jupyter.zip b/_downloads/b1c28450479cda89a20f021d24809e55/visual_n170_jupyter.zip index b7f9a84aa..3e7c71274 100644 Binary files a/_downloads/b1c28450479cda89a20f021d24809e55/visual_n170_jupyter.zip and b/_downloads/b1c28450479cda89a20f021d24809e55/visual_n170_jupyter.zip differ diff --git a/_downloads/b2ef39fb5bc5abfa985af6266c497c15/01r__p300_viz.ipynb b/_downloads/b2ef39fb5bc5abfa985af6266c497c15/01r__p300_viz.ipynb index 5339b78f6..f49f08393 100644 --- a/_downloads/b2ef39fb5bc5abfa985af6266c497c15/01r__p300_viz.ipynb +++ b/_downloads/b2ef39fb5bc5abfa985af6266c497c15/01r__p300_viz.ipynb @@ -4,7 +4,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "\n# P300 Load and Visualize Data\n\nThis example demonstrates loading, organizing, and visualizing ERP response data from the visual P300 experiment. The experiment uses a visual oddball paradigm. Images of cats and dogs are shwn in a rapid serial visual presentation (RSVP) stream, with cats and dogs categorized respectively as 'targets' or 'non-targets', according to which has high or low probability of occurring, respectively. \n\nThe data used is the first subject and first session of the one of the eeg-notebooks P300 example datasets, recorded using the InteraXon MUSE EEG headset (2016 model). This session consists of six two-minute blocks of continuous recording. \n\nWe first use the `fetch_datasets` to obtain a list of filenames. 
If these files are not already present \nin the specified data directory, they will be quickly downloaded from the cloud. \n\nAfter loading the data, we place it in an MNE `Epochs` object, and obtain the trial-averaged response. \n\nThe final figure plotted at the end shows the P300 response ERP waveform. \n" + "\n# P300 Load and Visualize Data\n\nThis example demonstrates loading, organizing, and visualizing ERP response data from the visual P300 experiment. The experiment uses a visual oddball paradigm. Images of cats and dogs are shwn in a rapid serial visual presentation (RSVP) stream, with cats and dogs categorized respectively as 'targets' or 'non-targets', according to which has high or low probability of occurring, respectively. \n\nThe data used is the first subject and first session of the one of the eeg-expy P300 example datasets, recorded using the InteraXon MUSE EEG headset (2016 model). This session consists of six two-minute blocks of continuous recording. \n\nWe first use the `fetch_datasets` to obtain a list of filenames. If these files are not already present \nin the specified data directory, they will be quickly downloaded from the cloud. \n\nAfter loading the data, we place it in an MNE `Epochs` object, and obtain the trial-averaged response. \n\nThe final figure plotted at the end shows the P300 response ERP waveform. 
\n" ] }, { @@ -29,7 +29,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "Load Data\n ---------------------\n\n We will use the eeg-notebooks N170 example dataset\n\n Note that if you are running this locally, the following cell will download\n the example dataset, if you do not already have it.\n\n##################################################################################################\n\n" + "Load Data\n ---------------------\n\n We will use the eeg-expy N170 example dataset\n\n Note that if you are running this locally, the following cell will download\n the example dataset, if you do not already have it.\n\n##################################################################################################\n\n" ] }, { diff --git a/_downloads/b6763bc95c5e2ede1c4ce186cc0c606a/00x__p300_run_experiment.ipynb b/_downloads/b6763bc95c5e2ede1c4ce186cc0c606a/00x__p300_run_experiment.ipynb index d2c712779..1f2385d3e 100644 --- a/_downloads/b6763bc95c5e2ede1c4ce186cc0c606a/00x__p300_run_experiment.ipynb +++ b/_downloads/b6763bc95c5e2ede1c4ce186cc0c606a/00x__p300_run_experiment.ipynb @@ -4,7 +4,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "\n# P300 run experiment\n\nThis example demonstrates the initiation of an EEG stream with eeg-notebooks, and how to run \nan experiment. \n" + "\n# P300 run experiment\n\nThis example demonstrates the initiation of an EEG stream with eeg-expy, and how to run \nan experiment. 
\n" ] }, { diff --git a/_downloads/b6fdaf0dd3351d46675abacd38607890/00x__p300_run_experiment.py b/_downloads/b6fdaf0dd3351d46675abacd38607890/00x__p300_run_experiment.py index 754f5340f..f5905360c 100644 --- a/_downloads/b6fdaf0dd3351d46675abacd38607890/00x__p300_run_experiment.py +++ b/_downloads/b6fdaf0dd3351d46675abacd38607890/00x__p300_run_experiment.py @@ -2,7 +2,7 @@ P300 run experiment =============================== -This example demonstrates the initiation of an EEG stream with eeg-notebooks, and how to run +This example demonstrates the initiation of an EEG stream with eeg-expy, and how to run an experiment. """ diff --git a/_downloads/bffc7389ff14934d31aa05a911f58bf0/01r__cueing_singlesub_analysis.ipynb b/_downloads/bffc7389ff14934d31aa05a911f58bf0/01r__cueing_singlesub_analysis.ipynb index 1dc876ed5..e39129800 100644 --- a/_downloads/bffc7389ff14934d31aa05a911f58bf0/01r__cueing_singlesub_analysis.ipynb +++ b/_downloads/bffc7389ff14934d31aa05a911f58bf0/01r__cueing_singlesub_analysis.ipynb @@ -29,7 +29,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "## Load Data\n\nWe will use the eeg-notebooks visual cueing example dataset\n\n\n" + "## Load Data\n\nWe will use the eeg-expy visual cueing example dataset\n\n\n" ] }, { diff --git a/_downloads/c8168bec3b40b12111ac87ce4dfeac13/01r__cueing_singlesub_analysis.py b/_downloads/c8168bec3b40b12111ac87ce4dfeac13/01r__cueing_singlesub_analysis.py index b3b4cf693..cb4a00827 100644 --- a/_downloads/c8168bec3b40b12111ac87ce4dfeac13/01r__cueing_singlesub_analysis.py +++ b/_downloads/c8168bec3b40b12111ac87ce4dfeac13/01r__cueing_singlesub_analysis.py @@ -35,7 +35,7 @@ # Load Data # --------------------- # -# We will use the eeg-notebooks visual cueing example dataset +# We will use the eeg-expy visual cueing example dataset # eegnb_data_path = os.path.join(os.path.expanduser('~/'),'.eegnb', 'data') diff --git a/_downloads/c9435ee669e38dd54bad678a26f2c8c1/00x__ssvep_run_experiment.ipynb 
b/_downloads/c9435ee669e38dd54bad678a26f2c8c1/00x__ssvep_run_experiment.ipynb index faf9ed5e9..1a31750d7 100644 --- a/_downloads/c9435ee669e38dd54bad678a26f2c8c1/00x__ssvep_run_experiment.ipynb +++ b/_downloads/c9435ee669e38dd54bad678a26f2c8c1/00x__ssvep_run_experiment.ipynb @@ -4,7 +4,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "\n# SSVEP run experiment\n\nThis example demonstrates the initiation of an EEG stream with eeg-notebooks, and how to run \nan experiment. \n" + "\n# SSVEP run experiment\n\nThis example demonstrates the initiation of an EEG stream with eeg-expy, and how to run \nan experiment. \n" ] }, { diff --git a/_downloads/d8a61e599ef859175e65b658c0800a68/visual_ssvep_python.zip b/_downloads/d8a61e599ef859175e65b658c0800a68/visual_ssvep_python.zip index 0636338d3..9da81123b 100644 Binary files a/_downloads/d8a61e599ef859175e65b658c0800a68/visual_ssvep_python.zip and b/_downloads/d8a61e599ef859175e65b658c0800a68/visual_ssvep_python.zip differ diff --git a/_downloads/d8e5dee04f613448d49984a260dabc0b/visual_p300_jupyter.zip b/_downloads/d8e5dee04f613448d49984a260dabc0b/visual_p300_jupyter.zip index 2a292f833..6f50fb80c 100644 Binary files a/_downloads/d8e5dee04f613448d49984a260dabc0b/visual_p300_jupyter.zip and b/_downloads/d8e5dee04f613448d49984a260dabc0b/visual_p300_jupyter.zip differ diff --git a/_downloads/db7df1055da643ad96640e6377001dbf/01r__n170_viz.py b/_downloads/db7df1055da643ad96640e6377001dbf/01r__n170_viz.py index ca4a3c024..d16a22ab6 100644 --- a/_downloads/db7df1055da643ad96640e6377001dbf/01r__n170_viz.py +++ b/_downloads/db7df1055da643ad96640e6377001dbf/01r__n170_viz.py @@ -6,7 +6,7 @@ Images of faces and houses are shown in a rapid serial visual presentation (RSVP) stream. -The data used is the first subject and first session of the one of the eeg-notebooks N170 example datasets, recorded using the InteraXon MUSE EEG headset (2016 model). 
+The data used is the first subject and first session of the one of the eeg-expy N170 example datasets, recorded using the InteraXon MUSE EEG headset (2016 model). This session consists of six two-minute blocks of continuous recording. We first use the `fetch_datasets` to obtain a list of filenames. If these files are not already present @@ -42,7 +42,7 @@ # Load Data # --------------------- # -# We will use the eeg-notebooks N170 example dataset +# We will use the eeg-expy N170 example dataset # # Note that if you are running this locally, the following cell will download # the example dataset, if you do not already have it. diff --git a/_downloads/e500d6e9072380c7b8d43c4a40d8e4f1/01r__p300_viz.py b/_downloads/e500d6e9072380c7b8d43c4a40d8e4f1/01r__p300_viz.py index 2b0f3ac8f..1a4e6cdcf 100644 --- a/_downloads/e500d6e9072380c7b8d43c4a40d8e4f1/01r__p300_viz.py +++ b/_downloads/e500d6e9072380c7b8d43c4a40d8e4f1/01r__p300_viz.py @@ -4,7 +4,7 @@ This example demonstrates loading, organizing, and visualizing ERP response data from the visual P300 experiment. The experiment uses a visual oddball paradigm. Images of cats and dogs are shwn in a rapid serial visual presentation (RSVP) stream, with cats and dogs categorized respectively as 'targets' or 'non-targets', according to which has high or low probability of occurring, respectively. -The data used is the first subject and first session of the one of the eeg-notebooks P300 example datasets, recorded using the InteraXon MUSE EEG headset (2016 model). This session consists of six two-minute blocks of continuous recording. +The data used is the first subject and first session of the one of the eeg-expy P300 example datasets, recorded using the InteraXon MUSE EEG headset (2016 model). This session consists of six two-minute blocks of continuous recording. We first use the `fetch_datasets` to obtain a list of filenames. 
If these files are not already present in the specified data directory, they will be quickly downloaded from the cloud. @@ -39,7 +39,7 @@ # Load Data # --------------------- # -# We will use the eeg-notebooks N170 example dataset +# We will use the eeg-expy N170 example dataset # # Note that if you are running this locally, the following cell will download # the example dataset, if you do not already have it. diff --git a/_images/sphx_glr_01r__cueing_singlesub_analysis_004.png b/_images/sphx_glr_01r__cueing_singlesub_analysis_004.png index 455238957..d9c8ddd0c 100644 Binary files a/_images/sphx_glr_01r__cueing_singlesub_analysis_004.png and b/_images/sphx_glr_01r__cueing_singlesub_analysis_004.png differ diff --git a/_images/sphx_glr_01r__cueing_singlesub_analysis_012.png b/_images/sphx_glr_01r__cueing_singlesub_analysis_012.png index 1d0b1b10e..77b89a3fe 100644 Binary files a/_images/sphx_glr_01r__cueing_singlesub_analysis_012.png and b/_images/sphx_glr_01r__cueing_singlesub_analysis_012.png differ diff --git a/_images/sphx_glr_01r__n170_viz_003.png b/_images/sphx_glr_01r__n170_viz_003.png index 4b2fa3800..da92cf41b 100644 Binary files a/_images/sphx_glr_01r__n170_viz_003.png and b/_images/sphx_glr_01r__n170_viz_003.png differ diff --git a/_images/sphx_glr_01r__n170_viz_thumb.png b/_images/sphx_glr_01r__n170_viz_thumb.png index 778bd26e9..2c056894a 100644 Binary files a/_images/sphx_glr_01r__n170_viz_thumb.png and b/_images/sphx_glr_01r__n170_viz_thumb.png differ diff --git a/_images/sphx_glr_01r__p300_viz_003.png b/_images/sphx_glr_01r__p300_viz_003.png index 759002f51..d2808792d 100644 Binary files a/_images/sphx_glr_01r__p300_viz_003.png and b/_images/sphx_glr_01r__p300_viz_003.png differ diff --git a/_images/sphx_glr_01r__p300_viz_thumb.png b/_images/sphx_glr_01r__p300_viz_thumb.png index dda85e5bd..1e1c7f567 100644 Binary files a/_images/sphx_glr_01r__p300_viz_thumb.png and b/_images/sphx_glr_01r__p300_viz_thumb.png differ diff --git 
a/_images/sphx_glr_02r__n170_decoding_001.png b/_images/sphx_glr_02r__n170_decoding_001.png index 09e6c2aeb..13360bc71 100644 Binary files a/_images/sphx_glr_02r__n170_decoding_001.png and b/_images/sphx_glr_02r__n170_decoding_001.png differ diff --git a/_images/sphx_glr_02r__n170_decoding_thumb.png b/_images/sphx_glr_02r__n170_decoding_thumb.png index ed59ad7c4..b84393b9d 100644 Binary files a/_images/sphx_glr_02r__n170_decoding_thumb.png and b/_images/sphx_glr_02r__n170_decoding_thumb.png differ diff --git a/_images/sphx_glr_02r__p300_decoding_001.png b/_images/sphx_glr_02r__p300_decoding_001.png index 04b753852..db7ab85e5 100644 Binary files a/_images/sphx_glr_02r__p300_decoding_001.png and b/_images/sphx_glr_02r__p300_decoding_001.png differ diff --git a/_images/sphx_glr_02r__p300_decoding_thumb.png b/_images/sphx_glr_02r__p300_decoding_thumb.png index 4f5bb04c9..da4336ca1 100644 Binary files a/_images/sphx_glr_02r__p300_decoding_thumb.png and b/_images/sphx_glr_02r__p300_decoding_thumb.png differ diff --git a/_images/sphx_glr_02r__ssvep_decoding_001.png b/_images/sphx_glr_02r__ssvep_decoding_001.png index e4b658069..33dac76a6 100644 Binary files a/_images/sphx_glr_02r__ssvep_decoding_001.png and b/_images/sphx_glr_02r__ssvep_decoding_001.png differ diff --git a/_images/sphx_glr_02r__ssvep_decoding_thumb.png b/_images/sphx_glr_02r__ssvep_decoding_thumb.png index 6d7f72657..fa1b81805 100644 Binary files a/_images/sphx_glr_02r__ssvep_decoding_thumb.png and b/_images/sphx_glr_02r__ssvep_decoding_thumb.png differ diff --git a/auto_examples/index.html b/auto_examples/index.html index 210832f0b..4df371acb 100644 --- a/auto_examples/index.html +++ b/auto_examples/index.html @@ -114,21 +114,21 @@
-We will use the eeg-notebooks visual cueing example dataset
+We will use the eeg-expy visual cueing example dataset
eegnb_data_path = os.path.join(os.path.expanduser('~/'),'.eegnb', 'data')
cueing_data_path = os.path.join(eegnb_data_path, 'visual-cueing', 'kylemathlab_dev')
@@ -490,25 +490,17 @@ Load Data
Downloading...
From (original): https://drive.google.com/uc?id=1ABOVJ9S0BeJOsqdGFnexaTFZ-ZcsIXfQ
-From (redirected): https://drive.usercontent.google.com/download?id=1ABOVJ9S0BeJOsqdGFnexaTFZ-ZcsIXfQ&confirm=t&uuid=463b3df9-c5e2-4391-8730-663992f23028
+From (redirected): https://drive.usercontent.google.com/download?id=1ABOVJ9S0BeJOsqdGFnexaTFZ-ZcsIXfQ&confirm=t&uuid=98d6ad3b-9d3b-4e11-ab00-7acb17a1bfe6
To: /home/runner/.eegnb/data/downloaded_data.zip
0%| | 0.00/102M [00:00<?, ?B/s]
- 4%|▎ | 3.67M/102M [00:00<00:02, 36.4MB/s]
- 9%|▉ | 8.91M/102M [00:00<00:03, 26.5MB/s]
- 17%|█▋ | 17.3M/102M [00:00<00:02, 35.4MB/s]
- 25%|██▌ | 25.7M/102M [00:00<00:02, 37.4MB/s]
- 34%|███▎ | 34.1M/102M [00:00<00:01, 36.2MB/s]
- 42%|████▏ | 42.5M/102M [00:01<00:02, 27.1MB/s]
- 50%|█████ | 50.9M/102M [00:01<00:01, 27.7MB/s]
- 56%|█████▋ | 57.1M/102M [00:01<00:01, 32.2MB/s]
- 65%|██████▍ | 65.5M/102M [00:01<00:00, 40.6MB/s]
- 70%|███████ | 71.3M/102M [00:02<00:00, 42.9MB/s]
- 75%|███████▌ | 76.5M/102M [00:02<00:00, 43.0MB/s]
- 82%|████████▏ | 83.4M/102M [00:02<00:00, 47.0MB/s]
- 89%|████████▉ | 90.7M/102M [00:02<00:00, 53.0MB/s]
- 96%|█████████▌| 97.0M/102M [00:02<00:00, 53.5MB/s]
-100%|██████████| 102M/102M [00:02<00:00, 40.3MB/s]
+ 1%| | 1.05M/102M [00:00<00:11, 9.12MB/s]
+ 11%|█▏ | 11.5M/102M [00:00<00:01, 61.5MB/s]
+ 27%|██▋ | 27.3M/102M [00:00<00:00, 103MB/s]
+ 42%|████▏ | 42.5M/102M [00:00<00:00, 118MB/s]
+ 61%|██████▏ | 62.4M/102M [00:00<00:00, 146MB/s]
+ 83%|████████▎ | 84.4M/102M [00:00<00:00, 161MB/s]
+100%|██████████| 102M/102M [00:00<00:00, 140MB/s]
Loading these files:
@@ -761,7 +753,7 @@ Now we compute and plot the differences

-<matplotlib.patches.Rectangle object at 0x7fd832f63ee0>
+<matplotlib.patches.Rectangle object at 0x7f2e023ff8e0>
@@ -793,7 +785,7 @@ Target Epoching
-Total running time of the script: (2 minutes 9.992 seconds)
+Total running time of the script: (2 minutes 5.106 seconds)
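The visualization examples touched by this diff all follow the same analysis pattern: fetch the example dataset, cut the continuous recording into trials, trial-average, and (for SSVEP) look for the flicker frequency in the power spectrum of the averaged response. The following is a minimal, self-contained NumPy sketch of that last idea — why trial-averaging makes the stimulation frequency stand out — using made-up parameters (sampling rate, trial count, flicker frequency), not the eeg-expy or MNE API:

```python
import numpy as np

# Hypothetical parameters, loosely inspired by the MUSE examples above:
# 256 Hz sampling, 30 one-second trials of a 15 Hz flicker response.
fs = 256
n_trials, n_samples = 30, fs
stim_freq = 15.0

rng = np.random.default_rng(0)
t = np.arange(n_samples) / fs

# Each simulated "epoch" is the same phase-locked SSVEP sinusoid buried in noise.
epochs = np.sin(2 * np.pi * stim_freq * t) + rng.normal(0.0, 2.0, (n_trials, n_samples))

# Trial-averaging suppresses the (non-phase-locked) noise by ~1/sqrt(n_trials)
# while leaving the phase-locked response intact.
evoked = epochs.mean(axis=0)

# Power spectrum of the averaged response: the flicker frequency dominates.
psd = np.abs(np.fft.rfft(evoked)) ** 2
freqs = np.fft.rfftfreq(n_samples, 1.0 / fs)
peak = freqs[np.argmax(psd[1:]) + 1]  # skip the DC bin
print(peak)  # 15.0
```

The real examples do the epoching and averaging with an MNE `Epochs` object on recorded MUSE data, but the averaging-then-spectrum logic they visualize is the same as in this toy version.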