Pyinstaller with crawl4ai module not working

I have created the following program using the crawl4ai module and am now trying to compile it into an executable file for Windows.

This is the program code, which runs fine in general:

import asyncio
from crawl4ai import AsyncWebCrawler, CrawlerRunConfig, CacheMode, MemoryAdaptiveDispatcher, CrawlerMonitor, DisplayMode
from crawl4ai.async_configs import BrowserConfig, CrawlerRunConfig
import json
import pprint as pp

urls = [...]  # the actual URLs are omitted here

async def crawl_batch():
    browser_config = BrowserConfig(headless=True, verbose=False)
    run_config = CrawlerRunConfig(
        cache_mode=CacheMode.BYPASS,
        stream=False  # Default: get all results at once
    )

    dispatcher = MemoryAdaptiveDispatcher(
        memory_threshold_percent=70.0,
        check_interval=1.0,
        max_session_permit=10,
        monitor=CrawlerMonitor(
            display_mode=DisplayMode.DETAILED
        )
    )

    async with AsyncWebCrawler(config=browser_config) as crawler:
        # Get all results at once
        results = await crawler.arun_many(
            urls=urls,
            config=run_config,
            dispatcher=dispatcher
        )

        resultList = []
        # Process all results after completion
        for result in results:
            if result.success:
                resultList.append(result.markdown)
                # print(result.markdown)
                # await process_result(result)
            else:  
                print(f"Failed to crawl {result.url}: {result.error_message}")

        for e in resultList:
            print(e)
            input("Press!")
        print(len(resultList))
        
if __name__ == "__main__":
    asyncio.run(crawl_batch())      

I used the following command to create the executable:

pyinstaller --onefile  --add-data="C:/DEVNEU/.venv/crawl4ai/Lib/site-packages/playwright_stealth/js;./playwright_stealth/js" --collect-data fake_http_header.data --collect-data fake_useragent.data crawl_multi_url.py  
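
As far as I understand it, the --add-data argument uses "source;destination" (semicolon-separated on Windows) and each --collect-data flag pulls in the data files of the named package. The same options can also be expressed as the datas list of a .spec file, which is easier to extend if more data files have to be bundled later. This is only a rough, untested sketch of what I think the equivalent spec looks like (the exact Analysis/EXE boilerplate depends on the PyInstaller version; the site-packages path is the one from my venv):

# crawl_multi_url.spec - untested sketch, not the spec PyInstaller actually generated
from PyInstaller.utils.hooks import collect_data_files

datas = [
    # same source;destination pair as the --add-data flag above
    ("C:/DEVNEU/.venv/crawl4ai/Lib/site-packages/playwright_stealth/js", "playwright_stealth/js"),
]
# same packages as the two --collect-data flags
datas += collect_data_files("fake_http_header.data")
datas += collect_data_files("fake_useragent.data")

a = Analysis(
    ["crawl_multi_url.py"],
    pathex=[],
    binaries=[],
    datas=datas,
    hiddenimports=[],
)
pyz = PYZ(a.pure)

exe = EXE(
    pyz,
    a.scripts,
    a.binaries,
    a.datas,
    [],
    name="crawl_multi_url",
    console=True,
)

Such a spec would then be built with "pyinstaller crawl_multi_url.spec" instead of passing all the options on the command line.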

When I run the above code as a Python file it works fine, but when I run the created executable I get this error:

C:\DEVNEU\Python-Diverses\crawl4ai>crawl_multi_url.exe
                                                         Crawler Performance Monitor
╭──────────┬─────────────────────────────────────┬───────────┬─────────────┬───────────┬──────────┬──────────────────────────────────────────╮
│ Task ID  │ URL                                 │ Status    │ Memory (MB) │ Peak (MB) │ Duration │ Info                                     │
├──────────┼─────────────────────────────────────┼───────────┼─────────────┼───────────┼──────────┼──────────────────────────────────────────┤
│ SUMMARY  │ Total: 14                           │ Active: 0 │         2.8 │     105.6 │  0:00:12 │ ✓0 ✗14                                   │
├──────────┼─────────────────────────────────────┼───────────┼─────────────┼───────────┼──────────┼──────────────────────────────────────────┤
│ a6920beb │                  │ FAILED    │         0.3 │       0.3 │  0:00:00 │ Unexpected error in _crawl_web at line 1 │
│ 6d36514c │          │ FAILED    │         0.3 │       0.3 │  0:00:00 │ Unexpected error in _crawl_web at line 1 │
│ d089d7f2 │            │ FAILED    │         0.3 │       0.3 │  0:00:00 │ Unexpected error in _crawl_web at line 1 │
│ aa3cd0ef │            │ FAILED    │         0.3 │       0.3 │  0:00:00 │ Unexpected error in _crawl_web at line 1 │
│ 96750f07 │        │ FAILED    │         0.0 │       0.0 │  0:00:00 │ Unexpected error in _crawl_web at line 1 │
│ 1ac49dc3 │  │ FAILED    │         0.3 │       0.3 │  0:00:00 │ Unexpected error in _crawl_web at line 1 │
│ bdad503e │                   │ FAILED    │         0.3 │       0.3 │  0:00:02 │ Unexpected error in _crawl_web at line 1 │
│ 39cf4c3f │                │ FAILED    │         0.3 │       0.3 │  0:00:03 │ Unexpected error in _crawl_web at line 1 │
│ 9d5c2624 │    │ FAILED    │         0.0 │       0.0 │  0:00:03 │ Unexpected error in _crawl_web at line 1 │
│ 571ace60 │                  │ FAILED    │         0.0 │       0.0 │  0:00:03 │ Unexpected error in _crawl_web at line 1 │
│ 335ca9ce │         │ FAILED    │         0.3 │       0.3 │  0:00:04 │ Unexpected error in _crawl_web at line 1 │
│ 639a25fe │  │ FAILED    │         0.3 │       0.3 │  0:00:05 │ Unexpected error in _crawl_web at line 1 │
│ 22ddf78f │             │ FAILED    │         0.0 │       0.0 │  0:00:08 │ Unexpected error in _crawl_web at line 1 │
│ dc48d360 │               │ FAILED    │         0.3 │       0.3 │  0:00:10 │ Unexpected error in _crawl_web at line 1 │
╰──────────┴─────────────────────────────────────┴───────────┴─────────────┴───────────┴──────────┴──────────────────────────────────────────╯
Failed to crawl : Unexpected error in _crawl_web at line 1354 in _crawl_web (crawl4ai\async_crawler_strategy.py):
Error: Failed on navigating ACS-GOTO:
Page.goto: net::ERR_CONNECTION_CLOSED at /
Call log:
  - navigating to "/", waiting until "domcontentloaded"


Code context:

Failed to crawl : Unexpected error in _crawl_web at line 12 in load_js_script (crawl4ai\js_snippet\__init__.py):
Error: Script update_image_dimensions not found in the folder C:\Users\RapidTech1898\AppData\Local\Temp\_MEI258482\crawl4ai\js_snippet

Code context:

[... the same two messages repeat for all 14 URLs: three fail with ERR_CONNECTION_CLOSED and eleven with the missing update_image_dimensions script ...]

0
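
What strikes me about the second error is that the missing-script path points into PyInstaller's onefile extraction folder (\Temp\_MEI258482\crawl4ai\js_snippet), so it looks as if the crawl4ai\js_snippet\*.js files are not included in the bundle at all. To confirm that, I could temporarily add a small check to the script; this is just a diagnostic sketch (sys._MEIPASS only exists when running as a frozen PyInstaller onefile build):

import os
import sys

# When frozen with --onefile, PyInstaller unpacks all bundled data files into a
# temporary folder and exposes its path as sys._MEIPASS. Listing that folder shows
# whether crawl4ai's js_snippet/*.js files actually made it into the executable.
if getattr(sys, "frozen", False):
    bundle_dir = sys._MEIPASS
    snippet_dir = os.path.join(bundle_dir, "crawl4ai", "js_snippet")
    print("bundle dir:", bundle_dir)
    print("js_snippet bundled:", os.path.isdir(snippet_dir))
    if os.path.isdir(snippet_dir):
        print(os.listdir(snippet_dir))

If that folder turns out to be missing, I assume the js_snippet scripts have to be bundled the same way as the playwright_stealth ones (for example with another --add-data entry pointing at the crawl4ai/js_snippet folder in site-packages, or with --collect-data crawl4ai), but I have not verified whether that alone fixes the crawl.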

How can I create a working .exe file for this using PyInstaller?
