[MAINTENANCE] Add Snowflake cleanup script #10749

Open · wants to merge 2 commits into develop

Conversation

@tyler-hoffman (Contributor) commented Dec 6, 2024

WIP. I just deleted 4K schemas from Snowflake using the attached script. Will Snowflake tests run faster?
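The cleanup script itself is attached to the PR rather than quoted in this thread. Purely as a hedged sketch of what sweeping stale test schemas can look like (the SNOWFLAKE_* environment variables, the TEST_ prefix, and the one-day retention window below are assumptions for illustration, not taken from this PR), something along these lines would drop old test schemas:

```python
# Hypothetical sketch of a test-schema cleanup script (not the script in this PR).
# Assumes snowflake-connector-python and credentials supplied via environment variables.
import os
from datetime import datetime, timedelta, timezone

import snowflake.connector

SCHEMA_PREFIX = "TEST_"      # assumed naming convention for disposable test schemas
MAX_AGE = timedelta(days=1)  # assumed retention window


def main() -> None:
    conn = snowflake.connector.connect(
        account=os.environ["SNOWFLAKE_ACCOUNT"],
        user=os.environ["SNOWFLAKE_USER"],
        password=os.environ["SNOWFLAKE_PW"],
        database=os.environ["SNOWFLAKE_DATABASE"],
    )
    try:
        cur = conn.cursor()
        # INFORMATION_SCHEMA.SCHEMATA lists schemas along with their creation timestamps.
        cur.execute(
            "SELECT schema_name, created FROM information_schema.schemata "
            "WHERE schema_name LIKE %s",
            (f"{SCHEMA_PREFIX}%",),
        )
        cutoff = datetime.now(timezone.utc) - MAX_AGE
        for schema_name, created in cur.fetchall():
            if created < cutoff:
                # Identifiers cannot be bound as parameters, so quote them explicitly.
                cur.execute(f'DROP SCHEMA IF EXISTS "{schema_name}" CASCADE')
                print(f"Dropped {schema_name}")
    finally:
        conn.close()


if __name__ == "__main__":
    main()
```

A real version would presumably also guard against dropping anything outside the test naming convention and log what it removed.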

  • Description of PR changes above includes a link to an existing GitHub issue
  • PR title is prefixed with one of: [BUGFIX], [FEATURE], [DOCS], [MAINTENANCE], [CONTRIB]
  • Code is linted - run invoke lint (uses ruff format + ruff check)
  • Appropriate tests and docs have been updated

For more information about contributing, visit our community resources.

After you submit your PR, keep the page open and monitor the statuses of the various checks made by our continuous integration process at the bottom of the page. Please fix any issues that come up and reach out on Slack if you need help. Thanks for contributing!

netlify bot commented Dec 6, 2024

Deploy Preview for niobium-lead-7998 canceled.

🔨 Latest commit: 14c4381
🔍 Latest deploy log: https://app.netlify.com/projects/niobium-lead-7998/deploys/6837186db971920008e1a50b

codecov bot commented Dec 6, 2024

❌ 64 Tests Failed:

Tests completed: 17322 · Failed: 64 · Passed: 17258 · Skipped: 2481
View the full list of 3 ❄️ flaky tests
tests.integration.data_sources_and_expectations.expectations.test_unexpected_rows_expectation::test_unexpected_rows_expectation_batch_keyword_partitioner_failure[SELECT * FROM {batch} WHERE quantity > 0 AND temperature > 74-redshift]

Flake rate in main: 39.71% (Passed 82 times, Failed 54 times)

Stack Traces | 0.163s run time
self = <sqlalchemy.engine.base.Connection object at 0x7fad3e5f4d10>
engine = Engine(redshift+psycopg2://oss:***@oss-test-redshift-cluster.crz6wpzfh5px.us-east-1.redshift.amazonaws.com:5439/dev?sslmode=prefer)

    @override
    def test_connection(self, test_assets: bool = True) -> None:
        try:
            engine: sqlalchemy.Engine = self.get_engine()
>           engine.connect()

.../datasource/fluent/sql_datasource.py:1275

engine.connect() fails while SQLAlchemy checks a new DBAPI connection out of the pool (engine.raw_connection() -> pool.connect() -> psycopg2.connect()):

E       psycopg2.OperationalError: connection to server at "oss-test-redshift-cluster.crz6wpzfh5px.us-east-1.redshift.amazonaws.com" (44.198.206.69), port 5439 failed: FATAL:  connection limit "500" exceeded for non-bootstrap users

The above exception was the direct cause of the following exception. The test reaches test_connection() for RedshiftDatasource name='bfsiumxuon' through the asset_for_datasource fixture:

tests/integration/conftest.py:217: in asset_for_datasource
    yield _batch_setup_for_datasource.make_asset()
.../test_utils/data_source_config/redshift.py:91: in make_asset
    return self.context.data_sources.add_redshift(
.../datasource/fluent/sources.py:476: in add_datasource
    datasource.test_connection()

E           great_expectations.datasource.fluent.interfaces.TestConnectionError: Attempt to connect to datasource failed: due to OperationalError('(psycopg2.OperationalError) connection to server at "oss-test-redshift-cluster.crz6wpzfh5px.us-east-1.redshift.amazonaws.com" (44.198.206.69), port 5439 failed: FATAL:  connection limit "500" exceeded for non-bootstrap users\n')

.../datasource/fluent/sql_datasource.py:1277: TestConnectionError
tests.integration.data_sources_and_expectations.expectations.test_unexpected_rows_expectation::test_unexpected_rows_expectation_join_keyword_failure[redshift]

Flake rate in main: 38.13% (Passed 86 times, Failed 53 times)

Stack Traces | 0.163s run time
Same failure: RedshiftDatasource name='eumcrjrmxp' cannot check a DBAPI connection out of the pool because the cluster rejects it with FATAL: connection limit "500" exceeded for non-bootstrap users. Here the datasource is built through the batch_for_datasource fixture:

tests/integration/conftest.py:206: in batch_for_datasource
    yield _batch_setup_for_datasource.make_batch()
.../test_utils/data_source_config/sql.py:88: in make_batch
    self.make_asset()
.../test_utils/data_source_config/redshift.py:91: in make_asset
    return self.context.data_sources.add_redshift(
.../datasource/fluent/sources.py:476: in add_datasource
    datasource.test_connection()

E           great_expectations.datasource.fluent.interfaces.TestConnectionError: Attempt to connect to datasource failed: due to OperationalError('(psycopg2.OperationalError) connection to server at "oss-test-redshift-cluster.crz6wpzfh5px.us-east-1.redshift.amazonaws.com" (44.198.206.69), port 5439 failed: FATAL:  connection limit "500" exceeded for non-bootstrap users\n')

.../datasource/fluent/sql_datasource.py:1277: TestConnectionError
tests.integration.data_sources_and_expectations.expectations.test_unexpected_rows_expectation::test_unexpected_rows_expectation_join_keyword_partitioner_failure[redshift]

Flake rate in main: 40.29% (Passed 83 times, Failed 56 times)

Stack Traces | 0.164s run time
Same failure again, for RedshiftDatasource name='eeleehnqwt', reached through the asset_for_datasource fixture:

tests/integration/conftest.py:217: in asset_for_datasource
    yield _batch_setup_for_datasource.make_asset()
.../test_utils/data_source_config/redshift.py:91: in make_asset
    return self.context.data_sources.add_redshift(
.../datasource/fluent/sources.py:476: in add_datasource
    datasource.test_connection()

E           great_expectations.datasource.fluent.interfaces.TestConnectionError: Attempt to connect to datasource failed: due to OperationalError('(psycopg2.OperationalError) connection to server at "oss-test-redshift-cluster.crz6wpzfh5px.us-east-1.redshift.amazonaws.com" (44.198.206.69), port 5439 failed: FATAL:  connection limit "500" exceeded for non-bootstrap users\n')

.../datasource/fluent/sql_datasource.py:1277: TestConnectionError

To view more test analytics, go to the Test Analytics Dashboard
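Note: all three flaky tests fail the same way, and at comparable flake rates on main, so they appear unrelated to the Snowflake change in this PR; the Redshift cluster's 500-connection cap is being hit while parallel test runs create datasources and call test_connection(). As a generic, hedged illustration (placeholder URL; not the Great Expectations implementation), SQLAlchemy's NullPool is one way to avoid holding idle pooled connections against a server-side cap:

```python
# Minimal sketch (not GX code): avoid keeping idle connections checked out
# against a database-side connection limit. NullPool is standard SQLAlchemy.
from sqlalchemy import create_engine, text
from sqlalchemy.pool import NullPool

# Placeholder URL; a real run needs a reachable cluster and sqlalchemy-redshift installed.
REDSHIFT_URL = "redshift+psycopg2://user:password@example-cluster:5439/dev"

# NullPool closes the DBAPI connection as soon as it is returned, so a per-test
# engine stops counting against the server's limit once its check is done.
engine = create_engine(REDSHIFT_URL, poolclass=NullPool)

with engine.connect() as conn:      # same shape as test_connection() above
    conn.execute(text("SELECT 1"))  # connection is closed on exit, not pooled
engine.dispose()                    # release any remaining pool state
```

Whether that is the right lever here, versus reducing test parallelism or sharing a single engine across tests, is beyond what this report shows.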

@tyler-hoffman tyler-hoffman requested a review from a team December 10, 2024 11:15
@tyler-hoffman tyler-hoffman changed the title Add Snowflake cleanup script [MAINTENANCE] Add Snowflake cleanup script May 28, 2025