
fix: Improve validation errors in Crawlee CLI #1140

Merged 2 commits into master from update-templates on Apr 7, 2025

Conversation

@vdusek (Collaborator) commented on Apr 7, 2025

Description

  • Add checks to ensure that the selected package manager is installed and executable (a minimal sketch of such a check is shown below).
  • Suppress cookiecutter logs when a failure occurs, so only the relevant error message is shown.
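
The exact diff is not shown in this thread, but the traceback below reveals that the pre_gen_project hook runs `[manager, '--version']` via subprocess. A minimal sketch of what an installed-and-executable check could look like, assuming a hypothetical `ensure_package_manager` helper rather than the exact hook shipped in this PR:

```python
import shutil
import subprocess
import sys


def ensure_package_manager(manager: str) -> None:
    """Fail with a readable message if the chosen package manager cannot be used."""
    if manager in ('pip', 'manual'):
        return  # nothing extra to verify for these choices

    # Check that the executable exists on PATH at all.
    if shutil.which(manager) is None:
        print(
            f'Error: You selected {manager.title()} as your package manager, '
            'but it is not installed. Please install it and try again.',
            file=sys.stderr,
        )
        sys.exit(1)

    # Check that it can actually be executed.
    try:
        subprocess.check_output([manager, '--version'])
    except (OSError, subprocess.CalledProcessError):
        print(
            f'Error: {manager.title()} is installed but could not be executed.',
            file=sys.stderr,
        )
        sys.exit(1)


if __name__ == '__main__':
    ensure_package_manager('poetry')  # e.g. the value of --package-manager
```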

Issues

  • Closes: CLI prints stacktraces when trying to initialize a project with missing package managers

Testing

  • Output before the fix:

```
$ uvx crawlee[cli] create --crawler-type parsel --http-client httpx --package-manager poetry --start-url 'https://crawlee.dev/' --no-apify my-crawler
⠋ Bootstrapping...Traceback (most recent call last):
  File "/tmp/tmpd9wekhuy.py", line 9, in <module>
    version = subprocess.check_output([manager, '--version']).decode().strip()
              ~~~~~~~~~~~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/lib64/python3.13/subprocess.py", line 474, in check_output
    return run(*popenargs, stdout=PIPE, timeout=timeout, check=True,
           ~~~^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
               **kwargs).stdout
               ^^^^^^^^^
  File "/usr/lib64/python3.13/subprocess.py", line 556, in run
    with Popen(*popenargs, **kwargs) as process:
         ~~~~~^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/lib64/python3.13/subprocess.py", line 1038, in __init__
    self._execute_child(args, executable, preexec_fn, close_fds,
    ~~~~~~~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
                        pass_fds, cwd, env,
                        ^^^^^^^^^^^^^^^^^^^
    ...<5 lines>...
                        gid, gids, uid, umask,
                        ^^^^^^^^^^^^^^^^^^^^^^
                        start_new_session, process_group)
                        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/lib64/python3.13/subprocess.py", line 1974, in _execute_child
    raise child_exception_type(errno_num, err_msg, err_filename)
FileNotFoundError: [Errno 2] No such file or directory: 'poetry'

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/tmp/tmpd9wekhuy.py", line 11, in <module>
    raise RuntimeError(f'You chose to use the {manager_text} package manager, but it does not seem to be installed') from exc
RuntimeError: You chose to use the Poetry package manager, but it does not seem to be installed
Stopping generation because pre_gen_project hook script didn't exit successfully
╭────────────────────────────────────────────────────── Traceback (most recent call last) ───────────────────────────────────────────────────────╮
│ /home/vdusek/.cache/uv/archive-v0/1qtTDcjORmHzC3iaiBkgr/lib64/python3.13/site-packages/crawlee/_cli.py:203 in create                           │
│                                                                                                                                                │
│   200 │   │   │   │   transient=True,                                                                                                          │
│   201 │   │   │   ) as progress:                                                                                                               │
│   202 │   │   │   │   progress.add_task(description='Bootstrapping...', total=None)                                                            │
│ ❱ 203 │   │   │   │   cookiecutter(                                                                                                            │
│   204 │   │   │   │   │   template=str(template_directory),                                                                                    │
│   205 │   │   │   │   │   no_input=True,                                                                                                       │
│   206 │   │   │   │   │   extra_context={                                                                                                      │
│                                                                                                                                                │
│ ╭─────────────────────────────────── locals ───────────────────────────────────╮                                                               │
│ │             crawler_type = 'parsel'                                          │                                                               │
│ │ enable_apify_integration = False                                             │                                                               │
│ │              http_client = 'httpx'                                           │                                                               │
│ │          package_manager = 'poetry'                                          │                                                               │
│ │             package_name = 'my_crawler'                                      │                                                               │
│ │                 progress = <rich.progress.Progress object at 0x7f25d34ec2f0> │                                                               │
│ │             project_name = 'my-crawler'                                      │                                                               │
│ │                start_url = 'https://crawlee.dev/'                            │                                                               │
│ ╰──────────────────────────────────────────────────────────────────────────────╯                                                               │
│                                                                                                                                                │
│ /home/vdusek/.cache/uv/archive-v0/1qtTDcjORmHzC3iaiBkgr/lib64/python3.13/site-packages/cookiecutter/main.py:182 in cookiecutter                │
│                                                                                                                                                │
│   179 │                                                                                                                                        │
│   180 │   # Create project from local context and project template.                                                                            │
│   181 │   with import_patch:                                                                                                                   │
│ ❱ 182 │   │   result = generate_files(                                                                                                         │
│   183 │   │   │   repo_dir=repo_dir,                                                                                                           │
│   184 │   │   │   context=context,                                                                                                             │
│   185 │   │   │   overwrite_if_exists=overwrite_if_exists,                                                                                     │
│                                                                                                                                                │
│ ╭────────────────────────────────────────────────────────────────── locals ──────────────────────────────────────────────────────────────────╮ │
│ │            accept_hooks = True                                                                                                             │ │
│ │           base_repo_dir = '/home/vdusek/.cache/uv/archive-v0/1qtTDcjORmHzC3iaiBkgr/lib64/python3.13/site-pa'+31                            │ │
│ │                checkout = None                                                                                                             │ │
│ │                 cleanup = False                                                                                                            │ │
│ │   cleanup_base_repo_dir = False                                                                                                            │ │
│ │             config_dict = {                                                                                                                │ │
│ │                           │   'cookiecutters_dir': '/home/vdusek/.cookiecutters/',                                                         │ │
│ │                           │   'replay_dir': '/home/vdusek/.cookiecutter_replay/',                                                          │ │
│ │                           │   'default_context': OrderedDict(),                                                                            │ │
│ │                           │   'abbreviations': {                                                                                           │ │
│ │                           │   │   'gh': 'https://github.com/{0}.git',                                                                      │ │
│ │                           │   │   'gl': 'https://gitlab.com/{0}.git',                                                                      │ │
│ │                           │   │   'bb': 'https://bitbucket.org/{0}'                                                                        │ │
│ │                           │   }                                                                                                            │ │
│ │                           }                                                                                                                │ │
│ │             config_file = None                                                                                                             │ │
│ │                 context = OrderedDict({'cookiecutter': OrderedDict({'project_name': 'my-crawler', '__package_name': 'my_crawler',          │ │
│ │                           'crawler_type': 'parsel', '__crawler_type': 'parsel', 'http_client': 'httpx', 'package_manager': 'poetry',       │ │
│ │                           'enable_apify_integration': False, 'start_url': 'https://crawlee.dev/', '_jinja2_env_vars':                      │ │
│ │                           OrderedDict({'line_statement_prefix': '# %'}), '_extensions': ['jinja2.ext.do'], '_template':                    │ │
│ │                           '/home/vdusek/.cache/uv/archive-v0/1qtTDcjORmHzC3iaiBkgr/lib64/python3.13/site-packages/crawlee/project_templat… │ │
│ │                           '_output_dir': '/home/vdusek', '_repo_dir':                                                                      │ │
│ │                           '/home/vdusek/.cache/uv/archive-v0/1qtTDcjORmHzC3iaiBkgr/lib64/python3.13/site-packages/crawlee/project_templat… │ │
│ │                           '_checkout': None}), '_cookiecutter': {'project_name': 'my-crawler', 'crawler_type': ['parsel', 'beautifulsoup', │ │
│ │                           'playwright', 'playwright-camoufox'], 'http_client': ['httpx', 'curl-impersonate'], 'package_manager':           │ │
│ │                           ['poetry', 'pip', 'uv', 'manual'], 'enable_apify_integration': False, 'start_url': 'https://crawlee.dev/'}})     │ │
│ │            context_file = '/home/vdusek/.cache/uv/archive-v0/1qtTDcjORmHzC3iaiBkgr/lib64/python3.13/site-pa'+49                            │ │
│ │   context_for_prompting = OrderedDict({'cookiecutter': OrderedDict({'project_name': 'my-crawler', '__package_name': 'my_crawler',          │ │
│ │                           'crawler_type': 'parsel', '__crawler_type': 'parsel', 'http_client': 'httpx', 'package_manager': 'poetry',       │ │
│ │                           'enable_apify_integration': False, 'start_url': 'https://crawlee.dev/', '_jinja2_env_vars':                      │ │
│ │                           OrderedDict({'line_statement_prefix': '# %'}), '_extensions': ['jinja2.ext.do'], '_template':                    │ │
│ │                           '/home/vdusek/.cache/uv/archive-v0/1qtTDcjORmHzC3iaiBkgr/lib64/python3.13/site-packages/crawlee/project_templat… │ │
│ │                           '_output_dir': '/home/vdusek', '_repo_dir':                                                                      │ │
│ │                           '/home/vdusek/.cache/uv/archive-v0/1qtTDcjORmHzC3iaiBkgr/lib64/python3.13/site-packages/crawlee/project_templat… │ │
│ │                           '_checkout': None}), '_cookiecutter': {'project_name': 'my-crawler', 'crawler_type': ['parsel', 'beautifulsoup', │ │
│ │                           'playwright', 'playwright-camoufox'], 'http_client': ['httpx', 'curl-impersonate'], 'package_manager':           │ │
│ │                           ['poetry', 'pip', 'uv', 'manual'], 'enable_apify_integration': False, 'start_url': 'https://crawlee.dev/'}})     │ │
│ │          default_config = False                                                                                                            │ │
│ │               directory = None                                                                                                             │ │
│ │           extra_context = {                                                                                                                │ │
│ │                           │   'project_name': 'my-crawler',                                                                                │ │
│ │                           │   'package_manager': 'poetry',                                                                                 │ │
│ │                           │   'crawler_type': 'parsel',                                                                                    │ │
│ │                           │   'http_client': 'httpx',                                                                                      │ │
│ │                           │   'enable_apify_integration': False,                                                                           │ │
│ │                           │   'start_url': 'https://crawlee.dev/'                                                                          │ │
│ │                           }                                                                                                                │ │
│ │            import_patch = <cookiecutter.main._patch_import_path_for_repo object at 0x7f25d34ecc20>                                         │ │
│ │ keep_project_on_failure = False                                                                                                            │ │
│ │                no_input = True                                                                                                             │ │
│ │              output_dir = '.'                                                                                                              │ │
│ │     overwrite_if_exists = False                                                                                                            │ │
│ │                password = None                                                                                                             │ │
│ │                  replay = None                                                                                                             │ │
│ │                repo_dir = '/home/vdusek/.cache/uv/archive-v0/1qtTDcjORmHzC3iaiBkgr/lib64/python3.13/site-pa'+31                            │ │
│ │     skip_if_file_exists = False                                                                                                            │ │
│ │                template = '/home/vdusek/.cache/uv/archive-v0/1qtTDcjORmHzC3iaiBkgr/lib64/python3.13/site-pa'+31                            │ │
│ │           template_name = 'project_template'                                                                                               │ │
│ ╰────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────╯ │
│                                                                                                                                                │
│ /home/vdusek/.cache/uv/archive-v0/1qtTDcjORmHzC3iaiBkgr/lib64/python3.13/site-packages/cookiecutter/generate.py:342 in generate_files          │
│                                                                                                                                                │
│   339 │   delete_project_on_failure = output_directory_created and not keep_project_on_failure                                                 │
│   340 │                                                                                                                                        │
│   341 │   if accept_hooks:                                                                                                                     │
│ ❱ 342 │   │   run_hook_from_repo_dir(                                                                                                          │
│   343 │   │   │   repo_dir, 'pre_gen_project', project_dir, context, delete_project_on_failure                                                 │
│   344 │   │   )                                                                                                                                │
│   345                                                                                                                                          │
│                                                                                                                                                │
│ ╭────────────────────────────────────────────────────────────────── locals ──────────────────────────────────────────────────────────────────╮ │
│ │              accept_hooks = True                                                                                                           │ │
│ │                   context = OrderedDict({'cookiecutter': OrderedDict({'project_name': 'my-crawler', '__package_name': 'my_crawler',        │ │
│ │                             'crawler_type': 'parsel', '__crawler_type': 'parsel', 'http_client': 'httpx', 'package_manager': 'poetry',     │ │
│ │                             'enable_apify_integration': False, 'start_url': 'https://crawlee.dev/', '_jinja2_env_vars':                    │ │
│ │                             OrderedDict({'line_statement_prefix': '# %'}), '_extensions': ['jinja2.ext.do'], '_template':                  │ │
│ │                             '/home/vdusek/.cache/uv/archive-v0/1qtTDcjORmHzC3iaiBkgr/lib64/python3.13/site-packages/crawlee/project_templ… │ │
│ │                             '_output_dir': '/home/vdusek', '_repo_dir':                                                                    │ │
│ │                             '/home/vdusek/.cache/uv/archive-v0/1qtTDcjORmHzC3iaiBkgr/lib64/python3.13/site-packages/crawlee/project_templ… │ │
│ │                             '_checkout': None}), '_cookiecutter': {'project_name': 'my-crawler', 'crawler_type': ['parsel',                │ │
│ │                             'beautifulsoup', 'playwright', 'playwright-camoufox'], 'http_client': ['httpx', 'curl-impersonate'],           │ │
│ │                             'package_manager': ['poetry', 'pip', 'uv', 'manual'], 'enable_apify_integration': False, 'start_url':          │ │
│ │                             'https://crawlee.dev/'}})                                                                                      │ │
│ │ delete_project_on_failure = True                                                                                                           │ │
│ │                       env = <cookiecutter.environment.StrictEnvironment object at 0x7f25d34942d0>                                          │ │
│ │   keep_project_on_failure = False                                                                                                          │ │
│ │                output_dir = '.'                                                                                                            │ │
│ │  output_directory_created = True                                                                                                           │ │
│ │       overwrite_if_exists = False                                                                                                          │ │
│ │               project_dir = '/home/vdusek/my-crawler'                                                                                      │ │
│ │                  repo_dir = '/home/vdusek/.cache/uv/archive-v0/1qtTDcjORmHzC3iaiBkgr/lib64/python3.13/site-pa'+31                          │ │
│ │       skip_if_file_exists = False                                                                                                          │ │
│ │              template_dir = PosixPath('/home/vdusek/.cache/uv/archive-v0/1qtTDcjORmHzC3iaiBkgr/lib64/python3.13/site-packages/crawlee/pro… │ │
│ │            unrendered_dir = '{{cookiecutter.project_name}}'                                                                                │ │
│ ╰────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────╯ │
│                                                                                                                                                │
│ /home/vdusek/.cache/uv/archive-v0/1qtTDcjORmHzC3iaiBkgr/lib64/python3.13/site-packages/cookiecutter/hooks.py:157 in run_hook_from_repo_dir     │
│                                                                                                                                                │
│   154 │   """                                                                                                                                  │
│   155 │   with work_in(repo_dir):                                                                                                              │
│   156 │   │   try:                                                                                                                             │
│ ❱ 157 │   │   │   run_hook(hook_name, project_dir, context)                                                                                    │
│   158 │   │   except (                                                                                                                         │
│   159 │   │   │   FailedHookException,                                                                                                         │
│   160 │   │   │   UndefinedError,                                                                                                              │
│                                                                                                                                                │
│ ╭────────────────────────────────────────────────────────────────── locals ──────────────────────────────────────────────────────────────────╮ │
│ │                   context = OrderedDict({'cookiecutter': OrderedDict({'project_name': 'my-crawler', '__package_name': 'my_crawler',        │ │
│ │                             'crawler_type': 'parsel', '__crawler_type': 'parsel', 'http_client': 'httpx', 'package_manager': 'poetry',     │ │
│ │                             'enable_apify_integration': False, 'start_url': 'https://crawlee.dev/', '_jinja2_env_vars':                    │ │
│ │                             OrderedDict({'line_statement_prefix': '# %'}), '_extensions': ['jinja2.ext.do'], '_template':                  │ │
│ │                             '/home/vdusek/.cache/uv/archive-v0/1qtTDcjORmHzC3iaiBkgr/lib64/python3.13/site-packages/crawlee/project_templ… │ │
│ │                             '_output_dir': '/home/vdusek', '_repo_dir':                                                                    │ │
│ │                             '/home/vdusek/.cache/uv/archive-v0/1qtTDcjORmHzC3iaiBkgr/lib64/python3.13/site-packages/crawlee/project_templ… │ │
│ │                             '_checkout': None}), '_cookiecutter': {'project_name': 'my-crawler', 'crawler_type': ['parsel',                │ │
│ │                             'beautifulsoup', 'playwright', 'playwright-camoufox'], 'http_client': ['httpx', 'curl-impersonate'],           │ │
│ │                             'package_manager': ['poetry', 'pip', 'uv', 'manual'], 'enable_apify_integration': False, 'start_url':          │ │
│ │                             'https://crawlee.dev/'}})                                                                                      │ │
│ │ delete_project_on_failure = True                                                                                                           │ │
│ │                 hook_name = 'pre_gen_project'                                                                                              │ │
│ │               project_dir = '/home/vdusek/my-crawler'                                                                                      │ │
│ │                  repo_dir = '/home/vdusek/.cache/uv/archive-v0/1qtTDcjORmHzC3iaiBkgr/lib64/python3.13/site-pa'+31                          │ │
│ ╰────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────╯ │
│                                                                                                                                                │
│ /home/vdusek/.cache/uv/archive-v0/1qtTDcjORmHzC3iaiBkgr/lib64/python3.13/site-packages/cookiecutter/hooks.py:140 in run_hook                   │
│                                                                                                                                                │
│   137 │   │   return                                                                                                                           │
│   138 │   logger.debug('Running hook %s', hook_name)                                                                                           │
│   139 │   for script in scripts:                                                                                                               │
│ ❱ 140 │   │   run_script_with_context(script, project_dir, context)                                                                            │
│   141                                                                                                                                          │
│   142                                                                                                                                          │
│   143 def run_hook_from_repo_dir(                                                                                                              │
│                                                                                                                                                │
│ ╭────────────────────────────────────────────────────────────────── locals ──────────────────────────────────────────────────────────────────╮ │
│ │     context = OrderedDict({'cookiecutter': OrderedDict({'project_name': 'my-crawler', '__package_name': 'my_crawler', 'crawler_type':      │ │
│ │               'parsel', '__crawler_type': 'parsel', 'http_client': 'httpx', 'package_manager': 'poetry', 'enable_apify_integration':       │ │
│ │               False, 'start_url': 'https://crawlee.dev/', '_jinja2_env_vars': OrderedDict({'line_statement_prefix': '# %'}),               │ │
│ │               '_extensions': ['jinja2.ext.do'], '_template':                                                                               │ │
│ │               '/home/vdusek/.cache/uv/archive-v0/1qtTDcjORmHzC3iaiBkgr/lib64/python3.13/site-packages/crawlee/project_template',           │ │
│ │               '_output_dir': '/home/vdusek', '_repo_dir':                                                                                  │ │
│ │               '/home/vdusek/.cache/uv/archive-v0/1qtTDcjORmHzC3iaiBkgr/lib64/python3.13/site-packages/crawlee/project_template',           │ │
│ │               '_checkout': None}), '_cookiecutter': {'project_name': 'my-crawler', 'crawler_type': ['parsel', 'beautifulsoup',             │ │
│ │               'playwright', 'playwright-camoufox'], 'http_client': ['httpx', 'curl-impersonate'], 'package_manager': ['poetry', 'pip',     │ │
│ │               'uv', 'manual'], 'enable_apify_integration': False, 'start_url': 'https://crawlee.dev/'}})                                   │ │
│ │   hook_name = 'pre_gen_project'                                                                                                            │ │
│ │ project_dir = '/home/vdusek/my-crawler'                                                                                                    │ │
│ │      script = '/home/vdusek/.cache/uv/archive-v0/1qtTDcjORmHzC3iaiBkgr/lib/python3.13/site-pack'+54                                        │ │
│ │     scripts = ['/home/vdusek/.cache/uv/archive-v0/1qtTDcjORmHzC3iaiBkgr/lib/python3.13/site-pack'+54]                                      │ │
│ ╰────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────╯ │
│                                                                                                                                                │
│ /home/vdusek/.cache/uv/archive-v0/1qtTDcjORmHzC3iaiBkgr/lib64/python3.13/site-packages/cookiecutter/hooks.py:123 in run_script_with_context    │
│                                                                                                                                                │
│   120 │   │   output = template.render(**context)                                                                                              │
│   121 │   │   temp.write(output.encode('utf-8'))                                                                                               │
│   122 │                                                                                                                                        │
│ ❱ 123 │   run_script(temp.name, cwd)                                                                                                           │
│   124                                                                                                                                          │
│   125                                                                                                                                          │
│   126 def run_hook(hook_name, project_dir, context):                                                                                           │
│                                                                                                                                                │
│ ╭────────────────────────────────────────────────────────────────── locals ──────────────────────────────────────────────────────────────────╮ │
│ │           _ = '/home/vdusek/.cache/uv/archive-v0/1qtTDcjORmHzC3iaiBkgr/lib/python3.13/site-pack'+51                                        │ │
│ │    contents = "# % if cookiecutter.package_manager in ['poetry', 'uv']\nimport subprocess\nimport"+676                                     │ │
│ │     context = OrderedDict({'cookiecutter': OrderedDict({'project_name': 'my-crawler', '__package_name': 'my_crawler', 'crawler_type':      │ │
│ │               'parsel', '__crawler_type': 'parsel', 'http_client': 'httpx', 'package_manager': 'poetry', 'enable_apify_integration':       │ │
│ │               False, 'start_url': 'https://crawlee.dev/', '_jinja2_env_vars': OrderedDict({'line_statement_prefix': '# %'}),               │ │
│ │               '_extensions': ['jinja2.ext.do'], '_template':                                                                               │ │
│ │               '/home/vdusek/.cache/uv/archive-v0/1qtTDcjORmHzC3iaiBkgr/lib64/python3.13/site-packages/crawlee/project_template',           │ │
│ │               '_output_dir': '/home/vdusek', '_repo_dir':                                                                                  │ │
│ │               '/home/vdusek/.cache/uv/archive-v0/1qtTDcjORmHzC3iaiBkgr/lib64/python3.13/site-packages/crawlee/project_template',           │ │
│ │               '_checkout': None}), '_cookiecutter': {'project_name': 'my-crawler', 'crawler_type': ['parsel', 'beautifulsoup',             │ │
│ │               'playwright', 'playwright-camoufox'], 'http_client': ['httpx', 'curl-impersonate'], 'package_manager': ['poetry', 'pip',     │ │
│ │               'uv', 'manual'], 'enable_apify_integration': False, 'start_url': 'https://crawlee.dev/'}})                                   │ │
│ │         cwd = '/home/vdusek/my-crawler'                                                                                                    │ │
│ │         env = <cookiecutter.environment.StrictEnvironment object at 0x7f25d3495810>                                                        │ │
│ │   extension = '.py'                                                                                                                        │ │
│ │        file = <_io.TextIOWrapper                                                                                                           │ │
│ │               name='/home/vdusek/.cache/uv/archive-v0/1qtTDcjORmHzC3iaiBkgr/lib/python3.13/site-packages/crawlee/project_template/hooks/p… │ │
│ │               mode='r' encoding='utf-8'>                                                                                                   │ │
│ │      output = 'import subprocess\nimport re\n\nmanager = "poetry"\nmanager_text = manager.title()\nv'+431                                  │ │
│ │ script_path = '/home/vdusek/.cache/uv/archive-v0/1qtTDcjORmHzC3iaiBkgr/lib/python3.13/site-pack'+54                                        │ │
│ │        temp = <tempfile._TemporaryFileWrapper object at 0x7f25d34fc6e0>                                                                    │ │
│ │    template = <Template memory:7f25d3504d10>                                                                                               │ │
│ ╰────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────╯ │
│                                                                                                                                                │
│ /home/vdusek/.cache/uv/archive-v0/1qtTDcjORmHzC3iaiBkgr/lib64/python3.13/site-packages/cookiecutter/hooks.py:94 in run_script                  │
│                                                                                                                                                │
│    91 │   │   proc = subprocess.Popen(script_command, shell=run_thru_shell, cwd=cwd)  # nosec                                                  │
│    92 │   │   exit_status = proc.wait()                                                                                                        │
│    93 │   │   if exit_status != EXIT_SUCCESS:                                                                                                  │
│ ❱  94 │   │   │   raise FailedHookException(                                                                                                   │
│    95 │   │   │   │   f'Hook script failed (exit status: {exit_status})'                                                                       │
│    96 │   │   │   )                                                                                                                            │
│    97 │   except OSError as err:                                                                                                               │
│                                                                                                                                                │
│ ╭──────────────────────────────────────────────────── locals ────────────────────────────────────────────────────╮                             │
│ │            cwd = '/home/vdusek/my-crawler'                                                                     │                             │
│ │    exit_status = 1                                                                                             │                             │
│ │           proc = <Popen: returncode: 1 args: ['/home/vdusek/.cache/uv/archive-v0/1qtTDcjORmHz...>              │                             │
│ │ run_thru_shell = False                                                                                         │                             │
│ │ script_command = ['/home/vdusek/.cache/uv/archive-v0/1qtTDcjORmHzC3iaiBkgr/bin/python', '/tmp/tmpd9wekhuy.py'] │                             │
│ │    script_path = '/tmp/tmpd9wekhuy.py'                                                                         │                             │
│ ╰────────────────────────────────────────────────────────────────────────────────────────────────────────────────╯                             │
╰────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────╯
FailedHookException: Hook script failed (exit status: 1)
```

Example of output after the fix (uv not installed):

$ crawlee create --crawler-type parsel --http-client httpx --package-manager uv --start-url 'https://crawlee.dev/' --no-apify my-crawler
⠋ Bootstrapping...
Error: You selected Uv as your package manager, but it is not installed. Please install it and try again.
Stopping generation because pre_gen_project hook script didn't exit successfully
⠋ Bootstrapping...Project creation failed. Check the error message above.

Example of output after the fix (Poetry not installed):

$ crawlee create --crawler-type parsel --http-client httpx --package-manager poetry --start-url 'https://crawlee.dev/' --no-apify my-crawler
⠋ Bootstrapping...
Error: You selected Poetry as your package manager, but it is not installed. Please install it and try again.
Stopping generation because pre_gen_project hook script didn't exit successfully
⠋ Bootstrapping...Project creation failed. Check the error message above.
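
The cleaner endings above suggest the CLI now catches cookiecutter's hook failure instead of letting the rich traceback escape. A rough sketch of that pattern, using a hypothetical `bootstrap_project` wrapper rather than the exact `_cli.py` change:

```python
import sys

from cookiecutter.exceptions import FailedHookException
from cookiecutter.main import cookiecutter


def bootstrap_project(template_directory: str, extra_context: dict) -> None:
    """Run cookiecutter and turn hook failures into a short CLI message."""
    try:
        cookiecutter(
            template=str(template_directory),
            no_input=True,
            extra_context=extra_context,
        )
    except FailedHookException:
        # The pre_gen_project hook already printed a human-readable error,
        # so skip the full traceback and just point the user at it.
        print('Project creation failed. Check the error message above.', file=sys.stderr)
        sys.exit(1)
```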

Checklist

  • CI passed

github-actions bot added this to the 112nd sprint - Tooling team milestone on Apr 7, 2025
github-actions bot added the t-tooling label (Issues with this label are in the ownership of the tooling team) on Apr 7, 2025
@vdusek requested reviews from @janbuchar and @Pijukatel on April 7, 2025 at 13:28
@vdusek requested a review from @janbuchar on April 7, 2025 at 13:45
@janbuchar (Collaborator) left a comment

LGTM

@vdusek merged commit f2d33df into master on Apr 7, 2025
23 checks passed
@vdusek deleted the update-templates branch on April 7, 2025 at 17:12
Labels
t-tooling: Issues with this label are in the ownership of the tooling team.
Projects
None yet
Development

Successfully merging this pull request may close these issues.

CLI prints stacktraces when trying to initialize a project with missing package managers
2 participants