[CONTRIB] Exclude null values from the no-days-missing metric #10517


Status: Open. Wants to merge 4 commits into base branch develop.
Conversation

@itaise (Contributor) commented Oct 15, 2024

Hi,
This PR excludes null values from the ColumnDistinctDates metric, which is used in a test we contributed and are actively using.
If the checked column contains null values, the test fails with an exception (since None does not have a strftime method).
Can you please take a look?

Thanks a lot!
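As a rough sketch of the fix described above (a hypothetical helper, not the actual ColumnDistinctDates implementation), dropping nulls before formatting avoids the strftime error:

```python
from datetime import date

def distinct_formatted_dates(values):
    """Collect distinct formatted dates, skipping null entries.

    Hypothetical illustration of the fix: None has no strftime
    method, so nulls must be excluded before formatting.
    """
    return {v.strftime("%Y-%m-%d") for v in values if v is not None}

# A column containing a null no longer raises AttributeError:
dates = [date(2024, 1, 1), None, date(2024, 1, 2), date(2024, 1, 1)]
```

With this filter in place, `distinct_formatted_dates(dates)` yields `{"2024-01-01", "2024-01-02"}` instead of raising.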

netlify bot commented Oct 15, 2024

👷 Deploy request for niobium-lead-7998 pending review. Visit the deploys page to approve it.

Latest commit: d3f8392
@itaise (Contributor, Author) commented Oct 20, 2024

Can someone please review? Thanks!

@NathanFarmer (Contributor) left a comment

Hi @itaise! Sorry that this review seems to have fallen through the cracks. Based on your description this seems to be a bug. Can you please add a test to tests/integration/datasources_and_expectations/expectations that demonstrates the issue with the Expectation you described? You can follow the same pattern shown for other Expectations in that directory.
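A minimal pytest-style reproduction of the failure mode (plain Python for illustration only, not the project's actual integration-test harness in that directory) might look like:

```python
from datetime import date

import pytest

def format_dates(values):
    # Naive formatting with no null handling, mirroring the
    # reported bug: None has no strftime method.
    return [v.strftime("%Y-%m-%d") for v in values]

def test_null_value_raises_attribute_error():
    # A single null in the column is enough to break the metric.
    with pytest.raises(AttributeError):
        format_dates([date(2024, 1, 1), None])
```

The real test would instead follow the Expectation test pattern in tests/integration/datasources_and_expectations/expectations, exercising the metric through the Expectation described in the PR.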

netlify bot commented May 13, 2025

‼️ Deploy request for niobium-lead-7998 rejected.

Latest commit: 5ce1491

codecov bot commented May 13, 2025

❌ 568 Tests Failed:

Tests completed  Failed  Passed  Skipped
13554            568     12986   1144
View the top 3 failed test(s) by shortest run time
tests.integration.data_sources_and_expectations.expectations.test_expect_query_results_to_match_source::test_expect_query_results_to_match_source_success[redshift->redshift-both_results_are_empty]
Stack Traces | 0.003s run time
>       lambda: ihook(item=item, **kwds), when=when, reraise=reraise
    )

.../hostedtoolcache/Python/3.12.10.../x64/lib/python3.12.../site-packages/flaky/flaky_pytest_plugin.py:146: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

_batch_setup_for_datasource = <tests.integration.test_utils.data_source_config.redshift.RedshiftBatchTestSetup object at 0x7f6fa1e92ba0>
_cached_secondary_test_configs = {UUID('928cd66c-b708-4748-b4b1-d9b728a78cdb'): <tests.integration.test_utils.data_source_config.redshift.RedshiftBatch...6d4a8ba7'): <tests.integration.test_utils.data_source_config.redshift.RedshiftBatchTestSetup object at 0x7f6fa725f320>}

    @pytest.fixture
    def multi_source_batch(
        _batch_setup_for_datasource: BatchTestSetup,
        _cached_secondary_test_configs: dict[UUID, BatchTestSetup],
    ) -> Generator[MultiSourceBatch, None, None]:
        """Fixture that sets up multiple sources in a single data context."""
>       secondary_batch_setup = _cached_secondary_test_configs[_batch_setup_for_datasource.id]
E       KeyError: UUID('f84943ba-b90d-4efb-89b3-0e4ea335083d')

tests/integration/conftest.py:233: KeyError
tests.integration.data_sources_and_expectations.expectations.test_expect_query_results_to_match_source::test_expect_query_results_to_match_source_success[redshift->redshift-column_names_different_values_the_same]
Stack Traces | 0.003s run time
>       lambda: ihook(item=item, **kwds), when=when, reraise=reraise
    )

.../hostedtoolcache/Python/3.9.22.../x64/lib/python3.9.../site-packages/flaky/flaky_pytest_plugin.py:146: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

_batch_setup_for_datasource = <tests.integration.test_utils.data_source_config.redshift.RedshiftBatchTestSetup object at 0x7f82550b8f10>
_cached_secondary_test_configs = {UUID('25419823-956d-492a-b3bf-159c59d37862'): <tests.integration.test_utils.data_source_config.redshift.RedshiftBatch...999a67ba'): <tests.integration.test_utils.data_source_config.redshift.RedshiftBatchTestSetup object at 0x7f82594d54f0>}

    @pytest.fixture
    def multi_source_batch(
        _batch_setup_for_datasource: BatchTestSetup,
        _cached_secondary_test_configs: dict[UUID, BatchTestSetup],
    ) -> Generator[MultiSourceBatch, None, None]:
        """Fixture that sets up multiple sources in a single data context."""
>       secondary_batch_setup = _cached_secondary_test_configs[_batch_setup_for_datasource.id]
E       KeyError: UUID('cba5a5c1-8643-4adb-a4da-5cbaa7e730af')

tests/integration/conftest.py:233: KeyError
tests.integration.data_sources_and_expectations.expectations.test_expect_query_results_to_match_source::test_expect_query_results_to_match_source_success[redshift->redshift-column_names_different_values_the_same]
Stack Traces | 0.003s run time
>       lambda: ihook(item=item, **kwds), when=when, reraise=reraise
    )

.../hostedtoolcache/Python/3.12.10.../x64/lib/python3.12.../site-packages/flaky/flaky_pytest_plugin.py:146: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

_batch_setup_for_datasource = <tests.integration.test_utils.data_source_config.redshift.RedshiftBatchTestSetup object at 0x7f6fa1e92ba0>
_cached_secondary_test_configs = {UUID('928cd66c-b708-4748-b4b1-d9b728a78cdb'): <tests.integration.test_utils.data_source_config.redshift.RedshiftBatch...6d4a8ba7'): <tests.integration.test_utils.data_source_config.redshift.RedshiftBatchTestSetup object at 0x7f6fa725f320>}

    @pytest.fixture
    def multi_source_batch(
        _batch_setup_for_datasource: BatchTestSetup,
        _cached_secondary_test_configs: dict[UUID, BatchTestSetup],
    ) -> Generator[MultiSourceBatch, None, None]:
        """Fixture that sets up multiple sources in a single data context."""
>       secondary_batch_setup = _cached_secondary_test_configs[_batch_setup_for_datasource.id]
E       KeyError: UUID('f84943ba-b90d-4efb-89b3-0e4ea335083d')

tests/integration/conftest.py:233: KeyError


3 participants