diff --git a/.buildinfo b/.buildinfo index 3ed22fc2..d59181d7 100644 --- a/.buildinfo +++ b/.buildinfo @@ -1,4 +1,4 @@ # Sphinx build info version 1 # This file hashes the configuration used when building these files. When it is not found, a full rebuild will be done. -config: 0b0845f06aa0ea4305158e51e2e10944 +config: 94b4cc70cc05f65dc8cc2bd92a8c3d1e tags: 645f666f9bcd5a90fca523b33c5a78b7 diff --git a/_downloads/07fcc19ba03226cd3d83d4e40ec44385/auto_examples_python.zip b/_downloads/07fcc19ba03226cd3d83d4e40ec44385/auto_examples_python.zip index 61d2c7e6..9ec6e1a2 100644 Binary files a/_downloads/07fcc19ba03226cd3d83d4e40ec44385/auto_examples_python.zip and b/_downloads/07fcc19ba03226cd3d83d4e40ec44385/auto_examples_python.zip differ diff --git a/_downloads/2cbf7eec12f4415419df2cf85cbe5c5b/visual_p300_python.zip b/_downloads/2cbf7eec12f4415419df2cf85cbe5c5b/visual_p300_python.zip index 0f1a8209..b30bcdf7 100644 Binary files a/_downloads/2cbf7eec12f4415419df2cf85cbe5c5b/visual_p300_python.zip and b/_downloads/2cbf7eec12f4415419df2cf85cbe5c5b/visual_p300_python.zip differ diff --git a/_downloads/2f1a411d3414306e436c6540c86910c5/00x__n170_run_experiment.ipynb b/_downloads/2f1a411d3414306e436c6540c86910c5/00x__n170_run_experiment.ipynb index 4ccbaaf4..fc9f97ed 100644 --- a/_downloads/2f1a411d3414306e436c6540c86910c5/00x__n170_run_experiment.ipynb +++ b/_downloads/2f1a411d3414306e436c6540c86910c5/00x__n170_run_experiment.ipynb @@ -4,7 +4,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "\n# N170 run experiment\n\nThis example demonstrates the initiation of an EEG stream with eeg-notebooks, and how to run \nan experiment. \n" + "\n# N170 run experiment\n\nThis example demonstrates the initiation of an EEG stream with eeg-expy, and how to run \nan experiment. 
\n" ] }, { diff --git a/_downloads/2fc210914b93a8aaaaa1a6667585ad74/visual_cueing_jupyter.zip b/_downloads/2fc210914b93a8aaaaa1a6667585ad74/visual_cueing_jupyter.zip index 9dc938ee..68e819b5 100644 Binary files a/_downloads/2fc210914b93a8aaaaa1a6667585ad74/visual_cueing_jupyter.zip and b/_downloads/2fc210914b93a8aaaaa1a6667585ad74/visual_cueing_jupyter.zip differ diff --git a/_downloads/335ce423c80436163e4a75b13e3dba64/00x__ssvep_run_experiment.py b/_downloads/335ce423c80436163e4a75b13e3dba64/00x__ssvep_run_experiment.py index dfb8c046..ce23c176 100644 --- a/_downloads/335ce423c80436163e4a75b13e3dba64/00x__ssvep_run_experiment.py +++ b/_downloads/335ce423c80436163e4a75b13e3dba64/00x__ssvep_run_experiment.py @@ -2,7 +2,7 @@ SSVEP run experiment =============================== -This example demonstrates the initiation of an EEG stream with eeg-notebooks, and how to run +This example demonstrates the initiation of an EEG stream with eeg-expy, and how to run an experiment. """ @@ -40,4 +40,4 @@ # --------------------- # ssvep = VisualSSVEP(duration=record_duration, eeg=eeg_device, save_fn=save_fn) -ssvep.run() \ No newline at end of file +ssvep.run() diff --git a/_downloads/482813616f7e52f19737a9e9e4714600/01r__ssvep_viz.ipynb b/_downloads/482813616f7e52f19737a9e9e4714600/01r__ssvep_viz.ipynb index d197e645..f73ef5de 100644 --- a/_downloads/482813616f7e52f19737a9e9e4714600/01r__ssvep_viz.ipynb +++ b/_downloads/482813616f7e52f19737a9e9e4714600/01r__ssvep_viz.ipynb @@ -4,7 +4,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "\n# SSVEP Visualization\n\nThis example demonstrates loading, organizing, and visualizing data from the steady-state visual evoked potentials (SSVEP) experiment. \n\nThe data used is the first subject and first session of the one of the eeg-notebooks ssvep example datasets, recorded using the InteraXon MUSE EEG headset (2016 model). This session consists of six two-minute blocks of continuous recording. 
\n\nWe first use the `fetch_datasets` to obtain a list of filenames. If these files are not already present \nin the specified data directory, they will be quickly downloaded from the cloud. \n\nAfter loading the data, we place it in an MNE `Epochs` object, and obtain the trial-averaged response. \n\nThe final figures show the visual frequencies appearing in the measured power spectrum. \n" + "\n# SSVEP Visualization\n\nThis example demonstrates loading, organizing, and visualizing data from the steady-state visual evoked potentials (SSVEP) experiment. \n\nThe data used is the first subject and first session of the one of the eeg-expy ssvep example datasets, recorded using the InteraXon MUSE EEG headset (2016 model). This session consists of six two-minute blocks of continuous recording. \n\nWe first use the `fetch_datasets` to obtain a list of filenames. If these files are not already present \nin the specified data directory, they will be quickly downloaded from the cloud. \n\nAfter loading the data, we place it in an MNE `Epochs` object, and obtain the trial-averaged response. \n\nThe final figures show the visual frequencies appearing in the measured power spectrum. 
\n" ] }, { @@ -22,7 +22,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "Load Data\n ---------------------\n\n We will use the eeg-notebooks SSVEP example dataset\n\n Note that if you are running this locally, the following cell will download\n the example dataset, if you do not already have it.\n\n##################################################################################################\n\n" + "Load Data\n ---------------------\n\n We will use the eeg-expy SSVEP example dataset\n\n Note that if you are running this locally, the following cell will download\n the example dataset, if you do not already have it.\n\n##################################################################################################\n\n" ] }, { diff --git a/_downloads/4c0d23639fbd212d64cd06010552da22/visual_n170_python.zip b/_downloads/4c0d23639fbd212d64cd06010552da22/visual_n170_python.zip index 7417a01c..be504bec 100644 Binary files a/_downloads/4c0d23639fbd212d64cd06010552da22/visual_n170_python.zip and b/_downloads/4c0d23639fbd212d64cd06010552da22/visual_n170_python.zip differ diff --git a/_downloads/67001aab4aa80bdc4a44406add4d085e/visual_cueing_python.zip b/_downloads/67001aab4aa80bdc4a44406add4d085e/visual_cueing_python.zip index 33e73a87..10484290 100644 Binary files a/_downloads/67001aab4aa80bdc4a44406add4d085e/visual_cueing_python.zip and b/_downloads/67001aab4aa80bdc4a44406add4d085e/visual_cueing_python.zip differ diff --git a/_downloads/679acb9da15a2bd0f1cf78d5dea6995b/00x__n170_run_experiment.py b/_downloads/679acb9da15a2bd0f1cf78d5dea6995b/00x__n170_run_experiment.py index 51b6d2c6..03249e5d 100644 --- a/_downloads/679acb9da15a2bd0f1cf78d5dea6995b/00x__n170_run_experiment.py +++ b/_downloads/679acb9da15a2bd0f1cf78d5dea6995b/00x__n170_run_experiment.py @@ -2,7 +2,7 @@ N170 run experiment =============================== -This example demonstrates the initiation of an EEG stream with eeg-notebooks, and how to run +This example demonstrates the initiation of 
an EEG stream with eeg-expy, and how to run an experiment. """ @@ -40,4 +40,4 @@ # --------------------- # ssvep = VisualSSVEP(duration=record_duration, eeg=eeg_device, save_fn=save_fn) -ssvep.run() \ No newline at end of file +ssvep.run() diff --git a/_downloads/482813616f7e52f19737a9e9e4714600/01r__ssvep_viz.ipynb b/_downloads/482813616f7e52f19737a9e9e4714600/01r__ssvep_viz.ipynb index d197e645..f73ef5de 100644 --- a/_downloads/482813616f7e52f19737a9e9e4714600/01r__ssvep_viz.ipynb +++ b/_downloads/482813616f7e52f19737a9e9e4714600/01r__ssvep_viz.ipynb @@ -4,7 +4,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "\n# SSVEP Visualization\n\nThis example demonstrates loading, organizing, and visualizing data from the steady-state visual evoked potentials (SSVEP) experiment. \n\nThe data used is the first subject and first session of the one of the eeg-notebooks ssvep example datasets, recorded using the InteraXon MUSE EEG headset (2016 model). This session consists of six two-minute blocks of continuous recording. 
We first use the `fetch_datasets` to obtain a list of filenames. If these files are not already present in the specified data directory, they will be quickly downloaded from the cloud. @@ -38,7 +38,7 @@ # Load Data # --------------------- # -# We will use the eeg-notebooks SSVEP example dataset +# We will use the eeg-expy SSVEP example dataset # # Note that if you are running this locally, the following cell will download # the example dataset, if you do not already have it. diff --git a/_downloads/8d83acdbf2986ad81d4d9e035ff593ef/01r__n170_viz.ipynb b/_downloads/8d83acdbf2986ad81d4d9e035ff593ef/01r__n170_viz.ipynb index 00fdb815..d32078a1 100644 --- a/_downloads/8d83acdbf2986ad81d4d9e035ff593ef/01r__n170_viz.ipynb +++ b/_downloads/8d83acdbf2986ad81d4d9e035ff593ef/01r__n170_viz.ipynb @@ -4,7 +4,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "\n# N170 Load and Visualize Data\n\nThis example demonstrates loading, organizing, and visualizing ERP response data from the visual N170 experiment. \n\nImages of faces and houses are shown in a rapid serial visual presentation (RSVP) stream.\n\nThe data used is the first subject and first session of the one of the eeg-notebooks N170 example datasets, recorded using the InteraXon MUSE EEG headset (2016 model). \nThis session consists of six two-minute blocks of continuous recording. \n\nWe first use the `fetch_datasets` to obtain a list of filenames. If these files are not already present \nin the specified data directory, they will be quickly downloaded from the cloud. \n\nAfter loading the data, we place it in an MNE `Epochs` object, and obtain the trial-averaged response. \n\nThe final figure plotted at the end shows the N170 response ERP waveform. \n" + "\n# N170 Load and Visualize Data\n\nThis example demonstrates loading, organizing, and visualizing ERP response data from the visual N170 experiment. 
\n\nImages of faces and houses are shown in a rapid serial visual presentation (RSVP) stream.\n\nThe data used is the first subject and first session of the one of the eeg-expy N170 example datasets, recorded using the InteraXon MUSE EEG headset (2016 model). \nThis session consists of six two-minute blocks of continuous recording. \n\nWe first use the `fetch_datasets` to obtain a list of filenames. If these files are not already present \nin the specified data directory, they will be quickly downloaded from the cloud. \n\nAfter loading the data, we place it in an MNE `Epochs` object, and obtain the trial-averaged response. \n\nThe final figure plotted at the end shows the N170 response ERP waveform. \n" ] }, { @@ -29,7 +29,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "## Load Data\n\nWe will use the eeg-notebooks N170 example dataset\n\nNote that if you are running this locally, the following cell will download\nthe example dataset, if you do not already have it.\n\n\n" + "## Load Data\n\nWe will use the eeg-expy N170 example dataset\n\nNote that if you are running this locally, the following cell will download\nthe example dataset, if you do not already have it.\n\n\n" ] }, { diff --git a/_downloads/a74331ba25f1e4b93e2ecf2b3efd4a13/02r__ssvep_decoding.ipynb b/_downloads/a74331ba25f1e4b93e2ecf2b3efd4a13/02r__ssvep_decoding.ipynb index e5c4128b..dd4f9f6a 100644 --- a/_downloads/a74331ba25f1e4b93e2ecf2b3efd4a13/02r__ssvep_decoding.ipynb +++ b/_downloads/a74331ba25f1e4b93e2ecf2b3efd4a13/02r__ssvep_decoding.ipynb @@ -4,7 +4,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "\n# SSVEP Decoding\n\nThis notebook runs only the data analysis part of N170 notebook.\n\nLook at the notes to see how this can be run on the web with binder or google collab.\n\nAll of the additional notes are removed; only the code cells are kept.\n" + "\n# SSVEP Decoding\n\nThis notebook runs only the data analysis part of experiment.\n\nLook at the notes to see how this 
can be run on the web with binder or google collab.\n\nAll of the additional notes are removed; only the code cells are kept.\n" ] }, { diff --git a/_downloads/b1193b749e4caeff4dbfdf0c20ae6801/visual_ssvep_jupyter.zip b/_downloads/b1193b749e4caeff4dbfdf0c20ae6801/visual_ssvep_jupyter.zip index 329fe4c0..59cc1d58 100644 Binary files a/_downloads/b1193b749e4caeff4dbfdf0c20ae6801/visual_ssvep_jupyter.zip and b/_downloads/b1193b749e4caeff4dbfdf0c20ae6801/visual_ssvep_jupyter.zip differ diff --git a/_downloads/b1c28450479cda89a20f021d24809e55/visual_n170_jupyter.zip b/_downloads/b1c28450479cda89a20f021d24809e55/visual_n170_jupyter.zip index b7f9a84a..3e7c7127 100644 Binary files a/_downloads/b1c28450479cda89a20f021d24809e55/visual_n170_jupyter.zip and b/_downloads/b1c28450479cda89a20f021d24809e55/visual_n170_jupyter.zip differ diff --git a/_downloads/b2ef39fb5bc5abfa985af6266c497c15/01r__p300_viz.ipynb b/_downloads/b2ef39fb5bc5abfa985af6266c497c15/01r__p300_viz.ipynb index 5339b78f..f49f0839 100644 --- a/_downloads/b2ef39fb5bc5abfa985af6266c497c15/01r__p300_viz.ipynb +++ b/_downloads/b2ef39fb5bc5abfa985af6266c497c15/01r__p300_viz.ipynb @@ -4,7 +4,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "\n# P300 Load and Visualize Data\n\nThis example demonstrates loading, organizing, and visualizing ERP response data from the visual P300 experiment. The experiment uses a visual oddball paradigm. Images of cats and dogs are shwn in a rapid serial visual presentation (RSVP) stream, with cats and dogs categorized respectively as 'targets' or 'non-targets', according to which has high or low probability of occurring, respectively. \n\nThe data used is the first subject and first session of the one of the eeg-notebooks P300 example datasets, recorded using the InteraXon MUSE EEG headset (2016 model). This session consists of six two-minute blocks of continuous recording. \n\nWe first use the `fetch_datasets` to obtain a list of filenames. 
If these files are not already present \nin the specified data directory, they will be quickly downloaded from the cloud. \n\nAfter loading the data, we place it in an MNE `Epochs` object, and obtain the trial-averaged response. \n\nThe final figure plotted at the end shows the P300 response ERP waveform. \n" + "\n# P300 Load and Visualize Data\n\nThis example demonstrates loading, organizing, and visualizing ERP response data from the visual P300 experiment. The experiment uses a visual oddball paradigm. Images of cats and dogs are shwn in a rapid serial visual presentation (RSVP) stream, with cats and dogs categorized respectively as 'targets' or 'non-targets', according to which has high or low probability of occurring, respectively. \n\nThe data used is the first subject and first session of the one of the eeg-expy P300 example datasets, recorded using the InteraXon MUSE EEG headset (2016 model). This session consists of six two-minute blocks of continuous recording. \n\nWe first use the `fetch_datasets` to obtain a list of filenames. If these files are not already present \nin the specified data directory, they will be quickly downloaded from the cloud. \n\nAfter loading the data, we place it in an MNE `Epochs` object, and obtain the trial-averaged response. \n\nThe final figure plotted at the end shows the P300 response ERP waveform. 
\n" ] }, { @@ -29,7 +29,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "Load Data\n ---------------------\n\n We will use the eeg-notebooks N170 example dataset\n\n Note that if you are running this locally, the following cell will download\n the example dataset, if you do not already have it.\n\n##################################################################################################\n\n" + "Load Data\n ---------------------\n\n We will use the eeg-expy N170 example dataset\n\n Note that if you are running this locally, the following cell will download\n the example dataset, if you do not already have it.\n\n##################################################################################################\n\n" ] }, { diff --git a/_downloads/b6763bc95c5e2ede1c4ce186cc0c606a/00x__p300_run_experiment.ipynb b/_downloads/b6763bc95c5e2ede1c4ce186cc0c606a/00x__p300_run_experiment.ipynb index d2c71277..1f2385d3 100644 --- a/_downloads/b6763bc95c5e2ede1c4ce186cc0c606a/00x__p300_run_experiment.ipynb +++ b/_downloads/b6763bc95c5e2ede1c4ce186cc0c606a/00x__p300_run_experiment.ipynb @@ -4,7 +4,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "\n# P300 run experiment\n\nThis example demonstrates the initiation of an EEG stream with eeg-notebooks, and how to run \nan experiment. \n" + "\n# P300 run experiment\n\nThis example demonstrates the initiation of an EEG stream with eeg-expy, and how to run \nan experiment. 
\n" ] }, { diff --git a/_downloads/b6fdaf0dd3351d46675abacd38607890/00x__p300_run_experiment.py b/_downloads/b6fdaf0dd3351d46675abacd38607890/00x__p300_run_experiment.py index 754f5340..f5905360 100644 --- a/_downloads/b6fdaf0dd3351d46675abacd38607890/00x__p300_run_experiment.py +++ b/_downloads/b6fdaf0dd3351d46675abacd38607890/00x__p300_run_experiment.py @@ -2,7 +2,7 @@ P300 run experiment =============================== -This example demonstrates the initiation of an EEG stream with eeg-notebooks, and how to run +This example demonstrates the initiation of an EEG stream with eeg-expy, and how to run an experiment. """ diff --git a/_downloads/bffc7389ff14934d31aa05a911f58bf0/01r__cueing_singlesub_analysis.ipynb b/_downloads/bffc7389ff14934d31aa05a911f58bf0/01r__cueing_singlesub_analysis.ipynb index 1dc876ed..e3912980 100644 --- a/_downloads/bffc7389ff14934d31aa05a911f58bf0/01r__cueing_singlesub_analysis.ipynb +++ b/_downloads/bffc7389ff14934d31aa05a911f58bf0/01r__cueing_singlesub_analysis.ipynb @@ -29,7 +29,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "## Load Data\n\nWe will use the eeg-notebooks visual cueing example dataset\n\n\n" + "## Load Data\n\nWe will use the eeg-expy visual cueing example dataset\n\n\n" ] }, { diff --git a/_downloads/c8168bec3b40b12111ac87ce4dfeac13/01r__cueing_singlesub_analysis.py b/_downloads/c8168bec3b40b12111ac87ce4dfeac13/01r__cueing_singlesub_analysis.py index b3b4cf69..cb4a0082 100644 --- a/_downloads/c8168bec3b40b12111ac87ce4dfeac13/01r__cueing_singlesub_analysis.py +++ b/_downloads/c8168bec3b40b12111ac87ce4dfeac13/01r__cueing_singlesub_analysis.py @@ -35,7 +35,7 @@ # Load Data # --------------------- # -# We will use the eeg-notebooks visual cueing example dataset +# We will use the eeg-expy visual cueing example dataset # eegnb_data_path = os.path.join(os.path.expanduser('~/'),'.eegnb', 'data') diff --git a/_downloads/c9435ee669e38dd54bad678a26f2c8c1/00x__ssvep_run_experiment.ipynb 
b/_downloads/c9435ee669e38dd54bad678a26f2c8c1/00x__ssvep_run_experiment.ipynb index faf9ed5e..1a31750d 100644 --- a/_downloads/c9435ee669e38dd54bad678a26f2c8c1/00x__ssvep_run_experiment.ipynb +++ b/_downloads/c9435ee669e38dd54bad678a26f2c8c1/00x__ssvep_run_experiment.ipynb @@ -4,7 +4,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "\n# SSVEP run experiment\n\nThis example demonstrates the initiation of an EEG stream with eeg-notebooks, and how to run \nan experiment. \n" + "\n# SSVEP run experiment\n\nThis example demonstrates the initiation of an EEG stream with eeg-expy, and how to run \nan experiment. \n" ] }, { diff --git a/_downloads/d8a61e599ef859175e65b658c0800a68/visual_ssvep_python.zip b/_downloads/d8a61e599ef859175e65b658c0800a68/visual_ssvep_python.zip index 0636338d..9da81123 100644 Binary files a/_downloads/d8a61e599ef859175e65b658c0800a68/visual_ssvep_python.zip and b/_downloads/d8a61e599ef859175e65b658c0800a68/visual_ssvep_python.zip differ diff --git a/_downloads/d8e5dee04f613448d49984a260dabc0b/visual_p300_jupyter.zip b/_downloads/d8e5dee04f613448d49984a260dabc0b/visual_p300_jupyter.zip index 2a292f83..6f50fb80 100644 Binary files a/_downloads/d8e5dee04f613448d49984a260dabc0b/visual_p300_jupyter.zip and b/_downloads/d8e5dee04f613448d49984a260dabc0b/visual_p300_jupyter.zip differ diff --git a/_downloads/db7df1055da643ad96640e6377001dbf/01r__n170_viz.py b/_downloads/db7df1055da643ad96640e6377001dbf/01r__n170_viz.py index ca4a3c02..d16a22ab 100644 --- a/_downloads/db7df1055da643ad96640e6377001dbf/01r__n170_viz.py +++ b/_downloads/db7df1055da643ad96640e6377001dbf/01r__n170_viz.py @@ -6,7 +6,7 @@ Images of faces and houses are shown in a rapid serial visual presentation (RSVP) stream. -The data used is the first subject and first session of the one of the eeg-notebooks N170 example datasets, recorded using the InteraXon MUSE EEG headset (2016 model).
+The data used is the first subject and first session of the one of the eeg-expy N170 example datasets, recorded using the InteraXon MUSE EEG headset (2016 model). This session consists of six two-minute blocks of continuous recording. We first use the `fetch_datasets` to obtain a list of filenames. If these files are not already present @@ -42,7 +42,7 @@ # Load Data # --------------------- # -# We will use the eeg-notebooks N170 example dataset +# We will use the eeg-expy N170 example dataset # # Note that if you are running this locally, the following cell will download # the example dataset, if you do not already have it. diff --git a/_downloads/e500d6e9072380c7b8d43c4a40d8e4f1/01r__p300_viz.py b/_downloads/e500d6e9072380c7b8d43c4a40d8e4f1/01r__p300_viz.py index 2b0f3ac8..1a4e6cdc 100644 --- a/_downloads/e500d6e9072380c7b8d43c4a40d8e4f1/01r__p300_viz.py +++ b/_downloads/e500d6e9072380c7b8d43c4a40d8e4f1/01r__p300_viz.py @@ -4,7 +4,7 @@ This example demonstrates loading, organizing, and visualizing ERP response data from the visual P300 experiment. The experiment uses a visual oddball paradigm. Images of cats and dogs are shwn in a rapid serial visual presentation (RSVP) stream, with cats and dogs categorized respectively as 'targets' or 'non-targets', according to which has high or low probability of occurring, respectively. -The data used is the first subject and first session of the one of the eeg-notebooks P300 example datasets, recorded using the InteraXon MUSE EEG headset (2016 model). This session consists of six two-minute blocks of continuous recording. +The data used is the first subject and first session of the one of the eeg-expy P300 example datasets, recorded using the InteraXon MUSE EEG headset (2016 model). This session consists of six two-minute blocks of continuous recording. We first use the `fetch_datasets` to obtain a list of filenames.
If these files are not already present in the specified data directory, they will be quickly downloaded from the cloud. @@ -39,7 +39,7 @@ # Load Data # --------------------- # -# We will use the eeg-notebooks N170 example dataset +# We will use the eeg-expy N170 example dataset # # Note that if you are running this locally, the following cell will download # the example dataset, if you do not already have it. diff --git a/_images/sphx_glr_01r__cueing_singlesub_analysis_004.png b/_images/sphx_glr_01r__cueing_singlesub_analysis_004.png index 45523895..d9c8ddd0 100644 Binary files a/_images/sphx_glr_01r__cueing_singlesub_analysis_004.png and b/_images/sphx_glr_01r__cueing_singlesub_analysis_004.png differ diff --git a/_images/sphx_glr_01r__cueing_singlesub_analysis_012.png b/_images/sphx_glr_01r__cueing_singlesub_analysis_012.png index 1d0b1b10..77b89a3f 100644 Binary files a/_images/sphx_glr_01r__cueing_singlesub_analysis_012.png and b/_images/sphx_glr_01r__cueing_singlesub_analysis_012.png differ diff --git a/_images/sphx_glr_01r__n170_viz_003.png b/_images/sphx_glr_01r__n170_viz_003.png index 4b2fa380..da92cf41 100644 Binary files a/_images/sphx_glr_01r__n170_viz_003.png and b/_images/sphx_glr_01r__n170_viz_003.png differ diff --git a/_images/sphx_glr_01r__n170_viz_thumb.png b/_images/sphx_glr_01r__n170_viz_thumb.png index 778bd26e..2c056894 100644 Binary files a/_images/sphx_glr_01r__n170_viz_thumb.png and b/_images/sphx_glr_01r__n170_viz_thumb.png differ diff --git a/_images/sphx_glr_01r__p300_viz_003.png b/_images/sphx_glr_01r__p300_viz_003.png index 759002f5..d2808792 100644 Binary files a/_images/sphx_glr_01r__p300_viz_003.png and b/_images/sphx_glr_01r__p300_viz_003.png differ diff --git a/_images/sphx_glr_01r__p300_viz_thumb.png b/_images/sphx_glr_01r__p300_viz_thumb.png index dda85e5b..1e1c7f56 100644 Binary files a/_images/sphx_glr_01r__p300_viz_thumb.png and b/_images/sphx_glr_01r__p300_viz_thumb.png differ diff --git
a/_images/sphx_glr_02r__n170_decoding_001.png b/_images/sphx_glr_02r__n170_decoding_001.png index 09e6c2ae..13360bc7 100644 Binary files a/_images/sphx_glr_02r__n170_decoding_001.png and b/_images/sphx_glr_02r__n170_decoding_001.png differ diff --git a/_images/sphx_glr_02r__n170_decoding_thumb.png b/_images/sphx_glr_02r__n170_decoding_thumb.png index ed59ad7c..b84393b9 100644 Binary files a/_images/sphx_glr_02r__n170_decoding_thumb.png and b/_images/sphx_glr_02r__n170_decoding_thumb.png differ diff --git a/_images/sphx_glr_02r__p300_decoding_001.png b/_images/sphx_glr_02r__p300_decoding_001.png index 04b75385..db7ab85e 100644 Binary files a/_images/sphx_glr_02r__p300_decoding_001.png and b/_images/sphx_glr_02r__p300_decoding_001.png differ diff --git a/_images/sphx_glr_02r__p300_decoding_thumb.png b/_images/sphx_glr_02r__p300_decoding_thumb.png index 4f5bb04c..da4336ca 100644 Binary files a/_images/sphx_glr_02r__p300_decoding_thumb.png and b/_images/sphx_glr_02r__p300_decoding_thumb.png differ diff --git a/_images/sphx_glr_02r__ssvep_decoding_001.png b/_images/sphx_glr_02r__ssvep_decoding_001.png index e4b65806..33dac76a 100644 Binary files a/_images/sphx_glr_02r__ssvep_decoding_001.png and b/_images/sphx_glr_02r__ssvep_decoding_001.png differ diff --git a/_images/sphx_glr_02r__ssvep_decoding_thumb.png b/_images/sphx_glr_02r__ssvep_decoding_thumb.png index 6d7f7265..fa1b8180 100644 Binary files a/_images/sphx_glr_02r__ssvep_decoding_thumb.png and b/_images/sphx_glr_02r__ssvep_decoding_thumb.png differ diff --git a/auto_examples/index.html b/auto_examples/index.html index 210832f0..4df371ac 100644 --- a/auto_examples/index.html +++ b/auto_examples/index.html @@ -114,21 +114,21 @@
Cueing Group Analysis Winter 2019
-We will use the eeg-notebooks visual cueing example dataset
+We will use the eeg-expy visual cueing example dataset
eegnb_data_path = os.path.join(os.path.expanduser('~/'),'.eegnb', 'data')
cueing_data_path = os.path.join(eegnb_data_path, 'visual-cueing', 'kylemathlab_dev')
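The two path lines above come straight from the rebuilt example page. As a self-contained sketch (stdlib only; `~/.eegnb/data` and the `visual-cueing/kylemathlab_dev` subfolder are the defaults shown in the example, not anything this diff changes):

```python
import os

# Default data directory used by the examples: ~/.eegnb/data
eegnb_data_path = os.path.join(os.path.expanduser("~/"), ".eegnb", "data")

# Each example dataset lives in an experiment-specific subfolder.
cueing_data_path = os.path.join(eegnb_data_path, "visual-cueing", "kylemathlab_dev")

print(cueing_data_path)
```

Note the paths are only constructed here; the download step below is what actually populates them.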
@@ -490,25 +490,17 @@ Load Data
Downloading...
From (original): https://drive.google.com/uc?id=1ABOVJ9S0BeJOsqdGFnexaTFZ-ZcsIXfQ
-From (redirected): https://drive.usercontent.google.com/download?id=1ABOVJ9S0BeJOsqdGFnexaTFZ-ZcsIXfQ&confirm=t&uuid=463b3df9-c5e2-4391-8730-663992f23028
+From (redirected): https://drive.usercontent.google.com/download?id=1ABOVJ9S0BeJOsqdGFnexaTFZ-ZcsIXfQ&confirm=t&uuid=98d6ad3b-9d3b-4e11-ab00-7acb17a1bfe6
To: /home/runner/.eegnb/data/downloaded_data.zip
0%| | 0.00/102M [00:00<?, ?B/s]
- 4%|▎ | 3.67M/102M [00:00<00:02, 36.4MB/s]
- 9%|▉ | 8.91M/102M [00:00<00:03, 26.5MB/s]
- 17%|█▋ | 17.3M/102M [00:00<00:02, 35.4MB/s]
- 25%|██▌ | 25.7M/102M [00:00<00:02, 37.4MB/s]
- 34%|███▎ | 34.1M/102M [00:00<00:01, 36.2MB/s]
- 42%|████▏ | 42.5M/102M [00:01<00:02, 27.1MB/s]
- 50%|█████ | 50.9M/102M [00:01<00:01, 27.7MB/s]
- 56%|█████▋ | 57.1M/102M [00:01<00:01, 32.2MB/s]
- 65%|██████▍ | 65.5M/102M [00:01<00:00, 40.6MB/s]
- 70%|███████ | 71.3M/102M [00:02<00:00, 42.9MB/s]
- 75%|███████▌ | 76.5M/102M [00:02<00:00, 43.0MB/s]
- 82%|████████▏ | 83.4M/102M [00:02<00:00, 47.0MB/s]
- 89%|████████▉ | 90.7M/102M [00:02<00:00, 53.0MB/s]
- 96%|█████████▌| 97.0M/102M [00:02<00:00, 53.5MB/s]
-100%|██████████| 102M/102M [00:02<00:00, 40.3MB/s]
+ 1%| | 1.05M/102M [00:00<00:11, 9.12MB/s]
+ 11%|█▏ | 11.5M/102M [00:00<00:01, 61.5MB/s]
+ 27%|██▋ | 27.3M/102M [00:00<00:00, 103MB/s]
+ 42%|████▏ | 42.5M/102M [00:00<00:00, 118MB/s]
+ 61%|██████▏ | 62.4M/102M [00:00<00:00, 146MB/s]
+ 83%|████████▎ | 84.4M/102M [00:00<00:00, 161MB/s]
+100%|██████████| 102M/102M [00:00<00:00, 140MB/s]
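The progress lines replaced above are tqdm-style output captured into the built HTML; the only substantive information in them is the final throughput (roughly 40 MB/s in the old build versus 140 MB/s in the new one). A minimal parser for that final-line layout (`cur/total [elapsed<eta, rate]`) — a sketch assuming this exact format, not part of the docs build:

```python
import re

def parse_rate(line: str) -> float:
    """Extract the trailing transfer rate in MB/s from a tqdm-style line,
    e.g. '100%|...| 102M/102M [00:02<00:00, 40.3MB/s]' -> 40.3."""
    match = re.search(r"([\d.]+)MB/s\]$", line.strip())
    if match is None:
        raise ValueError(f"no MB/s rate found in: {line!r}")
    return float(match.group(1))

# The bar glyphs between the pipes are irrelevant to the parse.
old_final = "100%|##########| 102M/102M [00:02<00:00, 40.3MB/s]"
new_final = "100%|##########| 102M/102M [00:00<00:00, 140MB/s]"
print(parse_rate(old_final), parse_rate(new_final))
```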
Loading these files:
@@ -761,7 +753,7 @@ Now we compute and plot the differences
-<matplotlib.patches.Rectangle object at 0x7fd832f63ee0>
+<matplotlib.patches.Rectangle object at 0x7f2e023ff8e0>
@@ -793,7 +785,7 @@ Target Epoching
-Total running time of the script: (2 minutes 9.992 seconds)
+Total running time of the script: (2 minutes 5.106 seconds)
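Apart from regenerated binaries and logs, this diff is a mechanical rename of the project name in the generated docs, from eeg-notebooks to eeg-expy. A hypothetical helper for performing such a sweep over source files (the function name, suffix list, and behavior are illustrative, not the script the maintainers actually used; in practice the rename is done in the gallery sources and Sphinx regenerates everything else):

```python
from pathlib import Path

def rename_in_tree(root, old="eeg-notebooks", new="eeg-expy",
                   suffixes=(".py", ".ipynb", ".html")):
    """Rewrite occurrences of `old` with `new` in matching text files.

    Returns the number of files changed. Binary artifacts (zips, pngs)
    are skipped; a docs rebuild regenerates those separately.
    """
    changed = 0
    for path in sorted(Path(root).rglob("*")):
        if path.is_file() and path.suffix in suffixes:
            text = path.read_text(encoding="utf-8")
            if old in text:
                path.write_text(text.replace(old, new), encoding="utf-8")
                changed += 1
    return changed
```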