
Problem Regarding Evaluation #15

Open
chrisliu298 opened this issue Oct 25, 2024 · 1 comment

Comments

@chrisliu298

Hi,

Thank you for developing this fantastic dataset for code generation!

I had a quick question about the evaluation script (compute_metric.py). In my experience, it takes about 17 hours to complete, even with the provided example generations. When I tried switching to parallel evaluation, the process finished much faster (in seconds), but all the results came back as zero. During evaluation, I used the original code and the generated solutions provided in this repo.
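For reference, my parallel run was set up roughly like the sketch below, using a `multiprocessing.Pool` over per-problem checks. This is a minimal illustration only; `check_solution` and the task format are hypothetical placeholders, not the actual compute_metric.py API. One thing I noticed while simplifying it: if the per-problem worker swallows exceptions and returns 0, a misconfigured run can finish in seconds with all-zero scores.

```python
# Hypothetical sketch of parallel per-problem evaluation.
# check_solution is a placeholder, NOT the real compute_metric.py API.
from multiprocessing import Pool

def check_solution(task):
    """Evaluate one (problem_id, solution) pair; return (problem_id, score)."""
    problem_id, solution = task
    try:
        # A real harness would exec the solution in a sandbox with a
        # timeout. Any exception raised here is mapped to a score of 0,
        # which is one way a fast parallel run can yield all zeros.
        return problem_id, 1 if solution.strip() else 0
    except Exception:
        return problem_id, 0

def evaluate_parallel(tasks, workers=2):
    """Run check_solution over all tasks in a worker pool."""
    with Pool(workers) as pool:
        return dict(pool.map(check_solution, tasks))
```

In my actual run the worker was the repo's test-execution code; the structure above is just to show where silent failures (and thus all-zero results) could originate.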

Is this expected behavior, or am I missing something? I would appreciate any guidance you could provide.

@sssszh

sssszh commented Nov 27, 2024

@chrisliu298 Hi, have you solved this problem? I’ve encountered the same issue.
