Merged
49 changes: 25 additions & 24 deletions .github/actions/create-venv-for-tests/action.yml
@@ -15,6 +15,9 @@ outputs:
runs:
using: 'composite'
steps:
+ - name: Install uv
+   uses: astral-sh/setup-uv@v6

# Used by tests for installation of ipykernel.
# Create a venv & register it as a kernel.
# These tests are slow hence will only run on linux.
Expand All @@ -24,27 +27,26 @@ runs:
- name: Create virtual environment without ipykernel
if: inputs.IPyWidgetVersion == '7'
run: |
- python -m venv .venvnoreg
- python -m venv .venvkernel
+ uv venv .venvnoreg
+ uv venv .venvkernel
source .venvnoreg/bin/activate
+ uv pip install pip
source .venvkernel/bin/activate
python --version
python -c "import sys;print(sys.executable)"
- python -m pip install ipykernel
+ uv pip install pip
+ uv pip install ipykernel
python -m ipykernel install --user --name .venvkernel --display-name .venvkernel
python -m pip uninstall jedi --yes
python -m pip install jedi==0.17.2
- python -m pip install pandas
- python -m pip install ipywidgets==7.7.2
+ uv pip install pandas
+ uv pip install ipywidgets==7.7.2
- python -m venv .venvnokernel
+ uv venv .venvnokernel
source .venvnokernel/bin/activate
python --version
python -c "import sys;print(sys.executable)"
- python -m pip install ipykernel
+ uv pip install pip
+ uv pip install ipykernel
python -m ipykernel install --user --name .venvnokernel --display-name .venvnokernel
python -m pip uninstall jedi --yes
python -m pip install jedi==0.17.2
working-directory: src/test/datascience
shell: bash
# Used by tests for installation of ipykernel.
@@ -56,26 +58,25 @@ runs:
- name: Create virtual environment without ipykernel (ipywidgets 8)
if: inputs.IPyWidgetVersion == '8'
run: |
- python -m venv .venvnoreg
- python -m venv .venvkernel
+ uv venv .venvnoreg
+ uv venv .venvkernel
source .venvnoreg/bin/activate
+ uv pip install pip
source .venvkernel/bin/activate
python --version
python -c "import sys;print(sys.executable)"
- python -m pip install ipykernel
+ uv pip install pip
+ uv pip install ipykernel
python -m ipykernel install --user --name .venvkernel --display-name .venvkernel
python -m pip uninstall jedi --yes
python -m pip install jedi==0.17.2
- python -m pip install pandas
- python -m pip install ipywidgets -U
+ uv pip install pandas
+ uv pip install ipywidgets -U
- python -m venv .venvnokernel
+ uv venv .venvnokernel
source .venvnokernel/bin/activate
python --version
python -c "import sys;print(sys.executable)"
- python -m pip install ipykernel
+ uv pip install pip
+ uv pip install ipykernel
python -m ipykernel install --user --name .venvnokernel --display-name .venvnokernel
python -m pip uninstall jedi --yes
python -m pip install jedi==0.17.2
working-directory: src/test/datascience
shell: bash
10 changes: 9 additions & 1 deletion .github/instructions/typescript.instructions.md
@@ -29,9 +29,17 @@ const mockInstance = mock<YourType>();
mockInstance.then = undefined; // Ensure 'then' is undefined to prevent hanging
```

+ ## Integration Tests
+ These tests can be a little slow as VS Code needs to be downloaded and launched.
+ - Use `npm run test:integration` to run integration tests with local Jupyter Kernels (add `-- -- --grep <pattern>` to filter tests, pay attention to the prefix `-- --`)
+ - Use `npm run test:integration:web` to run integration tests with Remote Jupyter Kernels and VSCode in browser (`--grep` is not supported)


## Scripts
- Use `npm install` to install dependencies if you changed `package.json`
- - Use `npm run test:unittests` for unit tests (add `--grep <pattern>` to filter tests)
+ - Use `npm run test:unittests` for unit tests (add `-- -- --grep <pattern>` to filter tests, pay attention to the prefix `-- --`)
+ - Use `npm run test:integration` to run integration tests with local Jupyter Kernels (add `-- -- --grep <pattern>` to filter tests, pay attention to the prefix `-- --`)
+ - Use `npm run test:integration:web` to run integration tests with Remote Jupyter Kernels and VSCode in browser (`--grep` is not supported)
- Use `npm run lint` to check for linter issues
- Use `npm run format` to check code style
- Use `npm run format-fix` to auto-fix formatting issues
36 changes: 17 additions & 19 deletions .github/workflows/build-test.yml
@@ -146,7 +146,7 @@ jobs:
run: npm run postinstall

- name: Verify Translation files
- run: npm run validateTranslationFiles
+ run: npx gulp validateTranslationFiles

- name: Run linting on TypeScript code (eslint)
run: npm run lint
@@ -166,8 +166,8 @@ jobs:
python -m black . --check
working-directory: pythonFiles

- - name: Run gulp prePublishNonBundle
-   run: npm run prePublishNonBundle
+ - name: Compile
+   run: npm run compile

- name: Check dependencies
run: npm run checkDependencies
@@ -176,7 +176,7 @@ jobs:
run: npx gulp validatePackageLockJson

- name: Validate TELEMETRY files
- run: npm run validateTelemetry
+ run: npx gulp validateTelemetry

- name: Verify usage of new Proposed API
if: github.event_name == 'pull_request'
@@ -249,15 +249,15 @@ jobs:
run: npm run postinstall

- name: Compile if not cached
- run: npm run prePublishNonBundle
+ run: npm run compile
if: steps.out-cache.outputs.cache-hit != 'true'

- name: Run TypeScript unit tests
id: test_unittests
run: npm run test:unittests

- name: Verify there are no unhandled errors
- run: npm run verifyUnhandledErrors
+ run: npx gulp verifyUnhandledErrors

vscodeTests:
name: Tests # These tests run with Python extension & real Jupyter
@@ -592,7 +592,7 @@ jobs:

# This step is slow.
- name: Compile # if not cached
- run: npm run prePublishNonBundle
+ run: npm run compile
# Do not cache for web tests, as the code generated in the compiled code is different for each matrix in web tests
# Remember the compiled code contains injected tests, and the injected tests are different for each matrix in the web
# if: steps.out-cache.outputs.cache-hit != 'true' && matrix.jupyterConnection != 'web'
@@ -659,7 +659,7 @@ jobs:
- name: Run Native Notebook with VSCode & Jupyter (ubuntu)
uses: GabrielBB/xvfb-action@b706e4e27b14669b486812790492dc50ca16b465 # v1.7
with:
- run: ${{ env.xvfbCommand }} npm run testNativeNotebooksInVSCode
+ run: ${{ env.xvfbCommand }} npm run test:integration
env:
VSC_JUPYTER_FORCE_LOGGING: 1
VSC_PYTHON_FORCE_LOGGING: 1
@@ -676,13 +676,11 @@ jobs:
- name: Run Notebook Perf Test Without Jupyter
uses: GabrielBB/xvfb-action@b706e4e27b14669b486812790492dc50ca16b465 # v1.7
with:
- run: ${{ env.xvfbCommand }} npm run testPerfInVSCode
+ run: ${{ env.xvfbCommand }} npm run test:performance:notebook
env:
VSC_JUPYTER_FORCE_LOGGING: 1
VSC_PYTHON_FORCE_LOGGING: 1
VSC_JUPYTER_CI_RUN_NON_PYTHON_NB_TEST: 1
- VSC_JUPYTER_PERF_TEST: ${{ matrix.matrix == '@notebookPerformance' }}
- VSC_JUPYTER_NOTEBOOK_PERF_TEST: ${{ matrix.matrix == '@notebookPerformance' }}
VSC_JUPYTER_CI_TEST_DO_NOT_INSTALL_PYTHON_EXT: ${{ matrix.matrix == '@notebookPerformance' }}
VSC_JUPYTER_REMOTE_NATIVE_TEST: ${{ matrix.jupyterConnection == 'remote' }}
VSC_JUPYTER_NON_RAW_NATIVE_TEST: ${{ matrix.jupyterConnection == 'local' }}
@@ -696,12 +694,11 @@ jobs:
- name: Run Execution Perf Test With Jupyter
uses: GabrielBB/xvfb-action@b706e4e27b14669b486812790492dc50ca16b465 # v1.7
with:
- run: ${{ env.xvfbCommand }} npm run testExecPerfInVSCode
+ run: ${{ env.xvfbCommand }} npm run test:performance:execution
env:
VSC_JUPYTER_FORCE_LOGGING: 1
VSC_PYTHON_FORCE_LOGGING: 1
VSC_JUPYTER_CI_RUN_NON_PYTHON_NB_TEST: 1
- VSC_JUPYTER_PERF_TEST: ${{ matrix.matrix == '@executionPerformance' }}
VSC_JUPYTER_REMOTE_NATIVE_TEST: ${{ matrix.jupyterConnection == 'remote' }}
VSC_JUPYTER_NON_RAW_NATIVE_TEST: ${{ matrix.jupyterConnection == 'local' }}
VSC_JUPYTER_CI_RUN_JAVA_NB_TEST: ${{ matrix.python == 'conda' }}
@@ -720,7 +717,7 @@ jobs:
- name: Run Native Notebook with VSCode & Jupyter (web)
uses: GabrielBB/xvfb-action@b706e4e27b14669b486812790492dc50ca16b465 # v1.7
with:
- run: npm run testWebExtension
+ run: npm run test:integration:web
env:
VSC_JUPYTER_FORCE_LOGGING: 1
VSC_PYTHON_FORCE_LOGGING: 1
@@ -735,8 +732,9 @@ jobs:
if: matrix.python != 'noPython' && matrix.os == 'ubuntu-latest' && matrix.jupyterConnection == 'web'

- name: Run Native Notebook with VSCode & Jupyter (windows)
+ # Running tests on Windows is very slow, hence run only a few of the tests.
run: |
-   npm run testNativeNotebooksInVSCodeWithoutTestSuffix
+   npm run test:integration:windows
env:
VSC_JUPYTER_FORCE_LOGGING: 1
VSC_PYTHON_FORCE_LOGGING: 1
@@ -754,7 +752,7 @@ jobs:
- name: Run Native Notebook with VSCode & Jupyter (without Python)
uses: GabrielBB/xvfb-action@b706e4e27b14669b486812790492dc50ca16b465 # v1.7
with:
- run: ${{ env.xvfbCommand }} npm run testNativeNotebooksWithoutPythonInVSCode
+ run: ${{ env.xvfbCommand }} npm run test:integration:nonpython
env:
VSC_JUPYTER_FORCE_LOGGING: 1
VSC_JUPYTER_CI_RUN_NON_PYTHON_NB_TEST: 1
@@ -773,7 +771,7 @@ jobs:

- name: Log test results
if: always()
- run: npm run printTestResults
+ run: npx gulp printTestResults

- name: Upload test result, screenshots files
uses: actions/upload-artifact@v4
@@ -784,7 +782,7 @@ jobs:
retention-days: 60

- name: Verify there are no unhandled errors
- run: npm run verifyUnhandledErrors
+ run: npx gulp verifyUnhandledErrors

smoke-tests:
timeout-minutes: 30
@@ -857,4 +855,4 @@ jobs:
VSC_JUPYTER_CI_TEST_VSC_CHANNEL: 'insiders'
uses: GabrielBB/xvfb-action@b706e4e27b14669b486812790492dc50ca16b465 # v1.7
with:
- run: npm run testSmokeLogged
+ run: npm run test:smoke
11 changes: 11 additions & 0 deletions .github/workflows/copilot-setup-steps.yml
@@ -72,3 +72,14 @@ jobs:
echo "Playwright not found, skipping browser installation"
fi
continue-on-error: true

+ - name: Install uv
+   uses: astral-sh/setup-uv@v6
+
+ - name: Setup Venv
+   run: |
+     uv venv
+     source .venv/bin/activate
+     uv pip install pip
+     uv pip install -r build/venv-test-ipywidgets8-requirements.txt
+     uv pip install jupyter notebook ipykernel
30 changes: 0 additions & 30 deletions .vscode/launch.json
@@ -126,34 +126,6 @@
"order": 10
}
},
- {
-     "name": "Jedi LSP tests",
-     "type": "extensionHost",
-     "request": "launch",
-     "runtimeExecutable": "${execPath}",
-     "args": [
-         "${workspaceFolder}/src/test",
-         "--disable-extensions",
-         "--extensionDevelopmentPath=${workspaceFolder}",
-         "--extensionTestsPath=${workspaceFolder}/out/test"
-     ],
-     "env": {
-         "VSC_JUPYTER_CI_TEST_GREP": "Language Server:"
-     },
-     "sourceMaps": true,
-     "outFiles": [
-         "${workspaceFolder}/out/**/*.js",
-         "!${workspaceFolder}/**/node_modules**/*"
-     ],
-     "preLaunchTask": "preTestJediLSP",
-     "skipFiles": [
-         "<node_internals>/**"
-     ],
-     "presentation": {
-         "group": "2_tests",
-         "order": 4
-     }
- },
{
// Run this first: https://github.com/microsoft/vscode-jupyter/blob/main/src/test/datascience/setupTestEnvs.cmd
// Then specify either a grep below or mark a test as 'test.only' to run the test that's failing.
@@ -309,7 +281,6 @@
"--extensionTestsPath=${workspaceFolder}/out/test/index.node.js"
],
"env": {
- "VSC_JUPYTER_PERF_TEST": "1",
"VSC_JUPYTER_CI_TEST_GREP": "@notebookPerformance",
"VSC_JUPYTER_CI_TEST_VSC_CHANNEL": "insiders",
"TEST_FILES_SUFFIX": "*.vscode.common.test",
@@ -341,7 +312,6 @@
"--extensionTestsPath=${workspaceFolder}/out/test/index.node.js"
],
"env": {
- "VSC_JUPYTER_PERF_TEST": "1",
"VSC_JUPYTER_CI_TEST_GREP": "@executionPerformance",
"VSC_JUPYTER_CI_TEST_VSC_CHANNEL": "insiders",
"CI_PYTHON_PATH": "",
19 changes: 0 additions & 19 deletions .vscode/tasks.json
@@ -55,25 +55,6 @@
"isDefault": true
}
},
- {
-     "type": "npm",
-     "script": "preTestJediLSP",
-     "problemMatcher": [],
-     "label": "preTestJediLSP"
- },
- {
-     "type": "npm",
-     "script": "launchWebExtension",
-     "problemMatcher": [],
-     "label": "Launch Web Extension (Chrome)"
- },
- {
-     "type": "shell",
-     "problemMatcher": [],
-     "command": "npm",
-     "args": ["run", "launchWebExtension", "--", "--browser=webkit", "--port=3111"],
-     "label": "Launch Web Extension (Safari)"
- },
{
"type": "npm",
"script": "lint",
2 changes: 1 addition & 1 deletion CONTRIBUTING.md
@@ -48,7 +48,7 @@ Run the `watch` build Tasks from the [Run Build Task...](https://code.visualstud
You can also compile from the command-line. For a full compile you can use:

```shell
- npx gulp prePublishNonBundle
+ npm run compile
```

For incremental builds it is recommended you use the `watch` build task (for better integration with VS Code).
2 changes: 1 addition & 1 deletion build/azure-pipeline.pre-release.yml
@@ -82,7 +82,7 @@ extends:
- script: npm run clean
displayName: Clean

- - script: npm run prePublishBundlePreRelease
+ - script: npm run build:prerelease
displayName: Build
env:
VSC_VSCE_TARGET: $(vsceTarget)
2 changes: 1 addition & 1 deletion build/azure-pipeline.stable.yml
@@ -76,7 +76,7 @@ extends:
- script: npm run clean
displayName: Clean

- - script: npm run prePublishBundleStable
+ - script: npm run build:stable
displayName: Build
env:
VSC_VSCE_TARGET: $(vsceTarget)
8 changes: 0 additions & 8 deletions gulpfile.js
@@ -200,16 +200,8 @@ gulp.task('updatePackageJsonForBundle', async () => {
}
});

- gulp.task('prePublishBundle', async () => {
-     await spawnAsync('npm', ['run', 'prePublishBundle']);
- });

gulp.task('checkDependencies', gulp.series('checkNativeDependencies', 'checkNpmDependencies'));

- gulp.task('prePublishNonBundle', async () => {
-     await spawnAsync('npm', ['run', 'prePublishNonBundle']);
- });

function spawnAsync(command, args) {
return new Promise((resolve, reject) => {
let stdOut = '';