Model is not supported: llama3_2_3b_instruct_q40 #131
Comments
Could you share the exact error you're seeing, and the command you ran in the CLI? The correct command:
Sure. I want to run it in a Docker container. Here are the commands inside the Dockerfile: , and this is the output of the build command: `16 [13/13] RUN echo "y" | python3 launch.py llama3-2-3b-instruct-q40`
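For reference, here is a hypothetical minimal Dockerfile reconstructed around that build step. The base image, installed packages, and repository URL are assumptions for illustration, not the exact file:

```dockerfile
# Hypothetical sketch; the real Dockerfile may differ.
FROM python:3.11-slim
RUN apt-get update && apt-get install -y git build-essential
# Assumed repository URL
RUN git clone https://github.com/b4rtaz/distributed-llama.git /app
WORKDIR /app
# echo "y" auto-confirms the model download prompt non-interactively
RUN echo "y" | python3 launch.py llama3-2-3b-instruct-q40
```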
I don't see any error there. Also, I added a new option.
BTW: the model name uses underscores, not dashes.
Thank you for your response and the modification.
Hmm... maybe for some reason you are using an old version of the repository. Could you try the commands below?
Thank you for your guidance. |
Hi,
I want to work with the newly added model llama3_2_3b_instruct_q40, but it shows an error when downloading the model in a Docker container. I checked launch.py, and the issue is caused by these lines:
```python
modelName = sys.argv[1].replace('-', '_')
if modelName not in MODELS:
    print(f'Model is not supported: {modelName}')
    exit(1)
```
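To reproduce the check in isolation, the logic can be run as a self-contained sketch. The `MODELS` list and the `resolve_model_name` wrapper here are hypothetical stand-ins for what launch.py actually defines:

```python
import sys

# Hypothetical one-entry stand-in for the full list in launch.py.
MODELS = ['llama3_2_3b_instruct_q40']

def resolve_model_name(arg: str) -> str:
    # Dashes from the CLI argument are normalized to underscores, so
    # 'llama3-2-3b-instruct-q40' and 'llama3_2_3b_instruct_q40'
    # resolve to the same key before the membership check.
    modelName = arg.replace('-', '_')
    if modelName not in MODELS:
        print(f'Model is not supported: {modelName}')
        sys.exit(1)
    return modelName

print(resolve_model_name('llama3-2-3b-instruct-q40'))
# → llama3_2_3b_instruct_q40
```

So with this normalization in place, either spelling should be accepted, which is why an outdated checkout (missing the model entry) is the likely cause of the error.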
I appreciate your guidance on this problem.
Thanks