
feat(ssl-certificate): get ssl certificate support proxy #961

Open · wants to merge 1 commit into base: next

Conversation

@wakaka6 commented Apr 9, 2025

Summary

Support proxy when getting the SSL certificate.

Same as PR #864, but based on the next branch.

```python
import asyncio
from crawl4ai import (
    AsyncWebCrawler,
    BrowserConfig,
    CrawlerRunConfig,
    CacheMode,
    DefaultMarkdownGenerator,
    CrawlResult,
)
from crawl4ai.async_configs import ProxyConfig


async def main():
    browser_config = BrowserConfig(headless=True, verbose=True)
    async with AsyncWebCrawler(config=browser_config) as crawler:
        crawler_config = CrawlerRunConfig(
            cache_mode=CacheMode.BYPASS,
            magic=True,
            fetch_ssl_certificate=True,
            proxy_config=ProxyConfig(server="socks5://127.0.0.1:1088"),
            markdown_generator=DefaultMarkdownGenerator(
                # content_filter=PruningContentFilter(
                #     threshold=0.48, threshold_type="fixed", min_word_threshold=0
                # )
            ),
        )
        result: CrawlResult = await crawler.arun(
            url="https://www.google.com", config=crawler_config
        )
        print("ssl:", result.ssl_certificate)
        print("markdown:", result.markdown[:500])


if __name__ == "__main__":
    asyncio.run(main())
```
```python
In [1]: from crawl4ai.ssl_certificate import SSLCertificate

In [2]: from crawl4ai.async_configs import ProxyConfig

In [3]: SSLCertificate.from_url(url="https://www.google.com", proxy_config=ProxyConfig("socks5://127.0.0.1:1088"), verify_ssl=False)
Out[3]: (<crawl4ai.ssl_certificate.SSLCertificate at 0x7681425650a0>, None)

In [4]: cert, err = SSLCertificate.from_url(url="https://www.google.com", proxy_config=ProxyConfig("socks5://127.0.0.1:1088"), verify_ssl=False)

In [5]: cert
Out[5]: <crawl4ai.ssl_certificate.SSLCertificate at 0x7681819005f0>

In [6]: cert.subject
Out[6]: {'CN': 'www.google.com'}

In [7]: cert.to_playwright_format()
Out[7]:
{'issuer': 'WR2',
 'subject': 'www.google.com',
 'valid_from': 1742440831,
 'valid_until': 1749698430}
```
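For reference, the kind of field mapping behind `to_playwright_format()` shown in `Out[7]` can be sketched as follows. This is an illustrative stand-in, not the PR's actual implementation: it assumes a certificate dict shaped like the output of Python's `ssl.SSLSocket.getpeercert()`, and the standalone function name only mirrors the method above.

```python
# Illustrative sketch (not the PR's code): extract issuer/subject common
# names and convert the validity window to Unix timestamps, assuming a
# cert dict shaped like ssl.SSLSocket.getpeercert() output.
import ssl


def to_playwright_format(cert: dict) -> dict:
    def common_name(rdns) -> str:
        # rdns is a tuple of RDNs, each a tuple of (key, value) pairs,
        # e.g. ((('commonName', 'www.google.com'),),)
        return dict(kv for rdn in rdns for kv in rdn).get("commonName", "")

    def epoch(cert_time: str) -> int:
        # e.g. 'Mar 20 03:20:31 2025 GMT' -> seconds since the Unix epoch
        return int(ssl.cert_time_to_seconds(cert_time))

    return {
        "issuer": common_name(cert["issuer"]),
        "subject": common_name(cert["subject"]),
        "valid_from": epoch(cert["notBefore"]),
        "valid_until": epoch(cert["notAfter"]),
    }
```

Fed the google.com certificate fields, this mapping produces the same dict shape as `Out[7]` above.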


List of files changed and why

ssl_certificate.py

  • Support proxy when getting the SSL certificate
  • Support exporting the certificate to Playwright format with ssl_certificate.to_playwright_format()
  • Support str(ssl_certificate)

proxy_config.py

  • Support converting URLs with embedded credentials to ProxyConfig. The user and password embedded in the URL override self.username and self.password.
  • e.g.
    ProxyConfig(server="http://user:pass@proxy-server:1080", username="", password="")
    --(normalize)--> ProxyConfig(server="http://proxy-server:1080", username="user", password="pass")
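The normalization step described above can be sketched with the standard library like this. It is an illustrative approximation, not the PR's implementation, and the function name is made up:

```python
# Hypothetical sketch of splitting embedded credentials out of a proxy URL.
from urllib.parse import urlsplit, urlunsplit


def split_proxy_credentials(server: str):
    """Return (server_without_credentials, username, password).

    Credentials embedded in the URL take precedence, so the caller can
    let them override any separately supplied username/password.
    """
    parts = urlsplit(server)
    if parts.username is None:
        return server, None, None
    netloc = parts.hostname or ""
    if parts.port is not None:
        netloc = f"{netloc}:{parts.port}"
    clean = urlunsplit((parts.scheme, netloc, parts.path, parts.query, parts.fragment))
    return clean, parts.username, parts.password
```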
    

async_crawler_strategy.py

  • When crawling, the proxy is set according to this configuration.

How Has This Been Tested?

  • In a network-restricted environment, http, https, and socks5 proxies were tested against sites blocked by a firewall such as the GFW (e.g. Google is not directly reachable from China, so an external proxy is needed); the SSL certificate was retrieved in every case.
  • In an environment with no network restrictions, the certificate can also be retrieved without a proxy.
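For context, the general technique for a proxied certificate fetch is to open the TCP connection through a PySocks socket and then perform the TLS handshake over it to read the peer certificate. A minimal sketch follows; it is illustrative only, not the PR's code, and assumes PySocks is installed and a SOCKS5 proxy is listening on the given address:

```python
# Illustrative sketch: fetch a server's certificate (DER bytes) through a
# SOCKS5 proxy by wrapping a PySocks socket with TLS. Not the PR's code.
import ssl

try:
    import socks  # PySocks; this PR pins a specific fork commit
except ImportError:  # keep the sketch importable without PySocks
    socks = None


def fetch_cert_der(host: str, port: int = 443,
                   proxy_host: str = "127.0.0.1", proxy_port: int = 1088,
                   verify_ssl: bool = False) -> bytes:
    if socks is None:
        raise RuntimeError("PySocks is required for proxied fetching")
    sock = socks.socksocket()
    sock.set_proxy(socks.SOCKS5, proxy_host, proxy_port)
    sock.settimeout(10)
    sock.connect((host, port))
    ctx = ssl.create_default_context()
    if not verify_ssl:
        ctx.check_hostname = False
        ctx.verify_mode = ssl.CERT_NONE
    with ctx.wrap_socket(sock, server_hostname=host) as tls:
        # getpeercert() returns {} when verification is off, so take the
        # DER form, which is always available after the handshake
        return tls.getpeercert(binary_form=True)
```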

Checklist:

  • My code follows the style guidelines of this project
  • I have performed a self-review of my own code
  • I have commented my code, particularly in hard-to-understand areas
  • I have made corresponding changes to the documentation
  • I have added/updated unit tests that prove my fix is effective or that my feature works
  • New and existing unit tests pass locally with my changes

@wakaka6 (Author) commented Apr 9, 2025

Here is the test script: ssl-proxy-test.md

@aravindkarnam (Collaborator) left a comment

@wakaka6 @unclecode I'm not very familiar with SSL-certificate-related concepts or the PySocks library, so I'm not in a position to properly review this right away, but I'll do so by the next alpha release.

I just had a question w.r.t. the dependency; please check that out.

```diff
@@ -43,6 +43,7 @@ dependencies = [
     "faust-cchardet>=2.1.19",
     "aiohttp>=3.11.11",
     "humanize>=4.10.0",
+    "PySocks @ git+https://github.com/amirasaran/PySocks.git@3da955fd212ce02c3ab3bc166b5bfac3c91b4019"
```
@wakaka6 Could you explain why you're using a direct Git reference with a specific commit hash instead of the official PyPI package?


@wakaka6 (Author) replied:

The official PySocks has buggy support for HTTP proxies (Anorov/PySocks#147), and the original author doesn't seem to have maintained it for a long time either, so I need to pin a specific commit of a PySocks fork as a dependency.

@aravindkarnam aravindkarnam added this to the APR-Bug fixes milestone Apr 12, 2025
@aravindkarnam

Fixes: #778 - Tagging for visibility
