3 changes: 3 additions & 0 deletions ci/test_cudf_polars_polars_tests.sh
@@ -48,6 +48,9 @@ sed -i '/PydanticDeprecatedSince212/a \ warnings.simplefilter("ignore", Depre
# Remove upper bound on aiosqlite once we support polars >1.36.1
sed -i 's/^aiosqlite/aiosqlite>=0.21.0,<0.22.0/' polars/py-polars/requirements-dev.txt

# Remove upper bound on pandas once we support 3.0.0+
sed -i 's/^pandas/pandas>=2.0,<2.4.0/' polars/py-polars/requirements-dev.txt

# Pyparsing release 3.3.0 deprecates the enablePackrat method, which is used by the
# version of pyiceberg that polars is currently pinned to. We can remove this skip
# when we move to a newer version of polars using a pyiceberg where this issue is fixed
1 change: 0 additions & 1 deletion conda/environments/all_cuda-129_arch-aarch64.yaml
@@ -64,7 +64,6 @@ dependencies:
- nvtx>=0.2.1
- openpyxl
- packaging
- pandas
- pandas>=2.0,<2.4.0
- pandoc
- polars>=1.30,<1.36
1 change: 0 additions & 1 deletion conda/environments/all_cuda-129_arch-x86_64.yaml
@@ -64,7 +64,6 @@ dependencies:
- nvtx>=0.2.1
- openpyxl
- packaging
- pandas
- pandas>=2.0,<2.4.0
- pandoc
- polars>=1.30,<1.36
1 change: 0 additions & 1 deletion conda/environments/all_cuda-131_arch-aarch64.yaml
@@ -64,7 +64,6 @@ dependencies:
- nvtx>=0.2.1
- openpyxl
- packaging
- pandas
- pandas>=2.0,<2.4.0
- pandoc
- polars>=1.30,<1.36
1 change: 0 additions & 1 deletion conda/environments/all_cuda-131_arch-x86_64.yaml
@@ -64,7 +64,6 @@ dependencies:
- nvtx>=0.2.1
- openpyxl
- packaging
- pandas
- pandas>=2.0,<2.4.0
- pandoc
- polars>=1.30,<1.36
6 changes: 3 additions & 3 deletions cpp/tests/rolling/rolling_test.cpp
@@ -1,5 +1,5 @@
/*
* SPDX-FileCopyrightText: Copyright (c) 2019-2025, NVIDIA CORPORATION.
* SPDX-FileCopyrightText: Copyright (c) 2019-2026, NVIDIA CORPORATION.
* SPDX-License-Identifier: Apache-2.0
*/

@@ -815,7 +815,7 @@ TYPED_TEST_SUITE(RollingTest, cudf::test::FixedWidthTypesWithoutFixedPoint);
// simple example from Pandas docs
TYPED_TEST(RollingTest, SimpleStatic)
{
// https://pandas.pydata.org/pandas-docs/stable/reference/api/pandas.DataFrame.rolling.html
// https://pandas.pydata.org/pandas-docs/version/2.3.3/reference/api/pandas.DataFrame.rolling.html
auto const col_data = cudf::test::make_type_param_vector<TypeParam>({0, 1, 2, 0, 4});
const std::vector<bool> col_mask = {1, 1, 1, 0, 1};

@@ -962,7 +962,7 @@ TYPED_TEST(RollingTest, NegativeWindowSizes)
// simple example from Pandas docs:
TYPED_TEST(RollingTest, SimpleDynamic)
{
// https://pandas.pydata.org/pandas-docs/stable/reference/api/pandas.DataFrame.rolling.html
// https://pandas.pydata.org/pandas-docs/version/2.3.3/reference/api/pandas.DataFrame.rolling.html
auto const col_data = cudf::test::make_type_param_vector<TypeParam>({0, 1, 2, 0, 4});
const std::vector<bool> col_mask = {1, 1, 1, 0, 1};

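Both rolling tests above mirror the example from the pandas `DataFrame.rolling` page now pinned to the 2.3.3 docs: the input column is `[0, 1, 2, <null>, 4]`. A minimal pandas sketch of that reference computation (the window size and aggregation used by each C++ test case may differ):

```python
import numpy as np
import pandas as pd

# Same data as the docs example: the masked-out fourth value becomes NaN.
s = pd.Series([0, 1, 2, np.nan, 4], dtype="float64")
# Window of 2 trailing values; min_periods=1 keeps partially-null windows.
print(s.rolling(window=2, min_periods=1).sum())
# 0    0.0
# 1    1.0
# 2    3.0
# 3    2.0
# 4    4.0
```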
4 changes: 2 additions & 2 deletions dependencies.yaml
@@ -694,7 +694,7 @@ dependencies:
- output_types: [conda, requirements, pyproject]
packages:
- fsspec>=0.6.0
- pandas>=2.0,<2.4.0
- &pandas pandas>=2.0,<2.4.0
run_pylibcudf:
common:
- output_types: [conda, requirements, pyproject]
@@ -878,7 +878,7 @@ dependencies:
- nanoarrow
- hypothesis>=6.131.7
- *numba
- pandas
- *pandas
- output_types: conda
packages:
- python-xxhash
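The `&pandas` / `*pandas` pair introduced above is a standard YAML anchor and alias: the pin is written once and reused wherever `*pandas` appears, so the two dependency lists cannot drift apart. A small, hypothetical sketch (not the real dependencies.yaml layout) showing how a YAML loader resolves it:

```python
import yaml  # PyYAML

# Illustrative snippet only: `&pandas` names the pinned requirement,
# `*pandas` reuses it verbatim elsewhere in the document.
snippet = """
run:
  - &pandas pandas>=2.0,<2.4.0
test:
  - *pandas
"""
data = yaml.safe_load(snippet)
assert data["run"][0] == data["test"][0] == "pandas>=2.0,<2.4.0"
```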
4 changes: 2 additions & 2 deletions docs/cudf/source/conf.py
@@ -1,4 +1,4 @@
# SPDX-FileCopyrightText: Copyright (c) 2018-2025, NVIDIA CORPORATION.
# SPDX-FileCopyrightText: Copyright (c) 2018-2026, NVIDIA CORPORATION.
# SPDX-License-Identifier: Apache-2.0
#
# cudf documentation build configuration file, created by
@@ -326,7 +326,7 @@ def clean_all_xml_files(path):
"nanoarrow": ("https://arrow.apache.org/nanoarrow/latest", None),
"numpy": ("https://numpy.org/doc/stable", None),
"pandas": (
"https://pandas.pydata.org/pandas-docs/stable/",
"https://pandas.pydata.org/pandas-docs/version/2.3.3/",
None,
),
"polars": ("https://docs.pola.rs/api/python/stable/", None),
2 changes: 1 addition & 1 deletion docs/cudf/source/cudf_pandas/faq.md
@@ -195,7 +195,7 @@ This means that automatic conversion between GPU and CPU types and automatic fal
There are a few known limitations that you should be aware of:

- Because fallback involves copying data from GPU to CPU and back,
[value mutability](https://pandas.pydata.org/pandas-docs/stable/getting_started/overview.html#mutability-and-copying-of-data)
[value mutability](https://pandas.pydata.org/pandas-docs/version/2.3.3/getting_started/overview.html#mutability-and-copying-of-data)
of Pandas objects is not always guaranteed. You should follow the
pandas recommendation to favor immutable operations.
- For performance reasons, joins and join-based operations are not
6 changes: 3 additions & 3 deletions docs/cudf/source/user_guide/groupby.md
@@ -9,7 +9,7 @@ myst:
# GroupBy

cuDF supports a small (but important) subset of Pandas' [groupby
API](https://pandas.pydata.org/pandas-docs/stable/user_guide/groupby.html).
API](https://pandas.pydata.org/pandas-docs/version/2.3.3/user_guide/groupby.html).

## Summary of supported operations

@@ -28,7 +28,7 @@ API](https://pandas.pydata.org/pandas-docs/stable/user_guide/groupby.html).
equivalent Pandas function. See the section on
[apply](#groupby-apply) for more details.
9. `GroupBy.pipe` similar to
[Pandas](https://pandas.pydata.org/pandas-docs/stable/user_guide/groupby.html#piping-function-calls).
[Pandas](https://pandas.pydata.org/pandas-docs/version/2.3.3/user_guide/groupby.html#piping-function-calls).

## Grouping

@@ -271,4 +271,4 @@ a
4 4 5 9
```

[describe]: https://pandas.pydata.org/pandas-docs/stable/user_guide/groupby.html#flexible-apply
[describe]: https://pandas.pydata.org/pandas-docs/version/2.3.3/user_guide/groupby.html#flexible-apply
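The `GroupBy.pipe` support mentioned in this guide follows the pandas pattern of chaining a callable onto a groupby object. A brief sketch with made-up column names, runnable with plain pandas (and, per the guide, with the cuDF equivalent):

```python
import pandas as pd

df = pd.DataFrame({"a": [1, 1, 2, 2], "b": [10, 20, 30, 40]})
# pipe passes the GroupBy object itself to the callable,
# which keeps chained expressions readable.
result = df.groupby("a").pipe(lambda grouped: grouped.b.max() - grouped.b.min())
print(result)
# a
# 1    10
# 2    10
```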
2 changes: 1 addition & 1 deletion python/cudf/cudf/core/column/datetime.py
@@ -383,7 +383,7 @@ def _round_dt(
],
freq: str,
) -> ColumnBase:
# https://pandas.pydata.org/pandas-docs/stable/reference/api/pandas.Timedelta.resolution_string.html
# https://pandas.pydata.org/pandas-docs/version/2.3.3/reference/api/pandas.Timedelta.resolution_string.html
old_to_new_freq_map = {
"H": "h",
"N": "ns",
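The map in this hunk translates deprecated uppercase pandas offset aliases (e.g. "H", "N") into the lowercase forms documented on the `Timedelta.resolution_string` page now pinned above. A hedged sketch of how such a map is typically applied; the `normalize_freq` helper is illustrative only, not cudf's actual code:

```python
# Entries taken from the hunk above; `normalize_freq` is a hypothetical helper.
old_to_new_freq_map = {"H": "h", "N": "ns"}

def normalize_freq(freq: str) -> str:
    # Fall back to the input when it is already a current alias.
    return old_to_new_freq_map.get(freq, freq)

assert normalize_freq("H") == "h"
assert normalize_freq("min") == "min"
```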
4 changes: 2 additions & 2 deletions python/cudf/cudf/core/column_accessor.py
@@ -1,4 +1,4 @@
# SPDX-FileCopyrightText: Copyright (c) 2021-2025, NVIDIA CORPORATION.
# SPDX-FileCopyrightText: Copyright (c) 2021-2026, NVIDIA CORPORATION.
# SPDX-License-Identifier: Apache-2.0

from __future__ import annotations
@@ -61,7 +61,7 @@ def from_zip(cls, data: Iterator):
def __getitem__(self, key):
"""Recursively apply dict.__getitem__ for nested elements."""
# As described in the pandas docs
# https://pandas.pydata.org/pandas-docs/stable/user_guide/advanced.html#advanced-indexing-with-hierarchical-index
# https://pandas.pydata.org/pandas-docs/version/2.3.3/user_guide/advanced.html#advanced-indexing-with-hierarchical-index
# accessing nested elements of a multiindex must be done using a tuple.
# Lists and other sequences are treated as accessing multiple elements
# at the top level of the index.
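The comment in this hunk refers to the pandas rule that nested elements of a MultiIndex are addressed with a tuple, while a list selects several top-level keys. A short pandas illustration (column names are made up):

```python
import pandas as pd

# Tuple keys produce MultiIndex columns.
df = pd.DataFrame({("a", "x"): [1, 2], ("a", "y"): [3, 4], ("b", "x"): [5, 6]})

df[("a", "x")]   # tuple: drills into the nested level -> a single column
df[["a", "b"]]   # list: selects multiple top-level groups -> a sub-frame
```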
4 changes: 2 additions & 2 deletions python/cudf/cudf/tests/reshape/test_pivot.py
@@ -1,4 +1,4 @@
# SPDX-FileCopyrightText: Copyright (c) 2025, NVIDIA CORPORATION.
# SPDX-FileCopyrightText: Copyright (c) 2025-2026, NVIDIA CORPORATION.
# SPDX-License-Identifier: Apache-2.0

import pandas as pd
@@ -43,7 +43,7 @@ def test_pivot_simple(index, column, data):

def test_pivot_multi_values():
# from Pandas docs:
# https://pandas.pydata.org/pandas-docs/stable/reference/api/pandas.DataFrame.pivot.html
# https://pandas.pydata.org/pandas-docs/version/2.3.3/reference/api/pandas.DataFrame.pivot.html
pdf = pd.DataFrame(
{
"foo": ["one", "one", "one", "two", "two", "two"],
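For reference, the pandas docs example this test draws from pivots a `foo`/`bar`/`baz` frame like the one being built above. A minimal pandas sketch of that reference call:

```python
import pandas as pd

pdf = pd.DataFrame(
    {
        "foo": ["one", "one", "one", "two", "two", "two"],
        "bar": ["A", "B", "C", "A", "B", "C"],
        "baz": [1, 2, 3, 4, 5, 6],
    }
)
# Rows keyed by `foo`, one column per `bar` value, cells filled from `baz`.
print(pdf.pivot(index="foo", columns="bar", values="baz"))
# bar  A  B  C
# foo
# one  1  2  3
# two  4  5  6
```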
4 changes: 2 additions & 2 deletions python/cudf/cudf/tests/series/methods/test_replace.py
@@ -1,4 +1,4 @@
# SPDX-FileCopyrightText: Copyright (c) 2025, NVIDIA CORPORATION.
# SPDX-FileCopyrightText: Copyright (c) 2025-2026, NVIDIA CORPORATION.
# SPDX-License-Identifier: Apache-2.0

import re
@@ -189,7 +189,7 @@ def test_series_fillna_numerical(
psr = psr.copy(deep=True)
# TODO: These tests should use Pandas' nullable int type
# when we support a recent enough version of Pandas
# https://pandas.pydata.org/pandas-docs/stable/user_guide/integer_na.html
# https://pandas.pydata.org/pandas-docs/version/2.3.3/user_guide/integer_na.html
if np.dtype(numeric_types_as_str).kind != "f" and psr.dtype.kind == "i":
psr = psr.astype(
cudf.utils.dtypes.np_dtypes_to_pandas_dtypes[
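The TODO in this hunk refers to pandas' nullable integer extension types, which keep an integer dtype in the presence of missing values instead of upcasting to float. A quick pandas illustration:

```python
import pandas as pd

# NumPy-backed int64 cannot hold NaN, so missing data upcasts to float64 ...
pd.Series([1, 2, None]).dtype                  # float64
# ... while the nullable extension dtype keeps integers and uses pd.NA.
pd.Series([1, 2, None], dtype="Int64").dtype   # Int64
```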
8 changes: 4 additions & 4 deletions python/cudf/cudf/utils/ioutils.py
@@ -932,7 +932,7 @@
----------
path_or_buf : string, buffer or path object
Path to the file to open, or an open `HDFStore
<https://pandas.pydata.org/pandas-docs/stable/user_guide/io.html#hdf5-pytables>`_.
<https://pandas.pydata.org/pandas-docs/version/2.3.3/user_guide/io.html#hdf5-pytables>`_.
object.
Supports any object implementing the ``__fspath__`` protocol.
This includes :class:`pathlib.Path` and py._path.local.LocalPath
@@ -943,7 +943,7 @@
mode : {'r', 'r+', 'a'}, optional
Mode to use when opening the file. Ignored if path_or_buf is a
`Pandas HDFS
<https://pandas.pydata.org/pandas-docs/stable/user_guide/io.html#hdf5-pytables>`_.
<https://pandas.pydata.org/pandas-docs/version/2.3.3/user_guide/io.html#hdf5-pytables>`_.
Default is 'r'.
where : list, optional
A list of Term (or convertible) objects.
@@ -987,7 +987,7 @@
please use append mode and a different key.
For more information see the `user guide
<https://pandas.pydata.org/pandas-docs/stable/user_guide/io.html#hdf5-pytables>`_.
<https://pandas.pydata.org/pandas-docs/version/2.3.3/user_guide/io.html#hdf5-pytables>`_.
Parameters
----------
@@ -1017,7 +1017,7 @@
List of columns to create as indexed data columns for on-disk
queries, or True to use all columns. By default only the axes
of the object are indexed. See `Query via Data Columns
<https://pandas.pydata.org/pandas-docs/stable/user_guide/io.html#io-hdf5-query-data-columns>`_.
<https://pandas.pydata.org/pandas-docs/version/2.3.3/user_guide/io.html#io-hdf5-query-data-columns>`_.
Applicable only to format='table'.
complevel : {0-9}, optional
Specifies a compression level for data.
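The `data_columns` option documented in this hunk enables on-disk queries against a table-format HDF5 store. A small pandas sketch (requires the optional PyTables dependency; the file name and columns are arbitrary):

```python
import pandas as pd

df = pd.DataFrame({"a": [1, 2, 3], "b": ["x", "y", "z"]})
# format="table" plus data_columns makes column `a` queryable on disk.
df.to_hdf("demo.h5", key="df", format="table", data_columns=["a"])
subset = pd.read_hdf("demo.h5", key="df", where="a > 1")
```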
2 changes: 1 addition & 1 deletion python/cudf/cudf/utils/queryutils.py
@@ -52,7 +52,7 @@ def visit_Name(self, node):
def query_parser(text):
"""The query expression parser.
See https://pandas.pydata.org/pandas-docs/stable/generated/pandas.DataFrame.query.html
See https://pandas.pydata.org/pandas-docs/version/2.3.3/generated/pandas.DataFrame.query.html
* names with '@' prefix are global reference.
* other names must be column names of the dataframe.
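As the docstring in this hunk notes, names prefixed with `@` in a query expression refer to Python variables in the enclosing scope rather than to columns. A short pandas example of the same convention:

```python
import pandas as pd

df = pd.DataFrame({"a": [1, 2, 3, 4]})
threshold = 2
# `a` is a column reference; `@threshold` pulls in the local variable.
print(df.query("a > @threshold"))
#    a
# 2  3
# 3  4
```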
4 changes: 2 additions & 2 deletions python/pylibcudf/pylibcudf/expressions.pyx
@@ -1,4 +1,4 @@
# SPDX-FileCopyrightText: Copyright (c) 2024-2025, NVIDIA CORPORATION.
# SPDX-FileCopyrightText: Copyright (c) 2024-2026, NVIDIA CORPORATION.
# SPDX-License-Identifier: Apache-2.0
import ast
import functools
@@ -331,7 +331,7 @@ _python_cudf_operator_map = {
# corresponding libcudf C++ AST operators.
_python_cudf_function_map = {
# TODO: Operators listed on
# https://pandas.pydata.org/pandas-docs/stable/user_guide/enhancingperf.html#expression-evaluation-via-eval # noqa: E501
# https://pandas.pydata.org/pandas-docs/version/2.3.3/user_guide/enhancingperf.html#expression-evaluation-via-eval # noqa: E501
# that we don't support yet:
# expm1, log1p, arctan2 and log10.
"isnull": ASTOperator.IS_NULL,
2 changes: 1 addition & 1 deletion python/pylibcudf/pyproject.toml
@@ -46,7 +46,7 @@ test = [
"nanoarrow",
"numba-cuda[cu13]>=0.22.2,<0.23.0",
"numba>=0.60.0,<0.62.0",
"pandas",
"pandas>=2.0,<2.4.0",
"pyarrow>=15.0.0,!=17.0.0; platform_machine=='aarch64'",
"pyarrow>=15.0.0; platform_machine=='x86_64'",
"pytest",