
Bug: __AVX2__ missing #10154

Closed
Djip007 opened this issue Nov 4, 2024 · 0 comments · Fixed by #10164
Labels
bug-unconfirmed
medium severity (Used to report medium severity bugs in llama.cpp, e.g. malfunctioning features that are still usable)

Comments

@Djip007
Contributor

Djip007 commented Nov 4, 2024

What happened?

if (ggml_cpu_has_avx2()) {

There is an #ifdef __AVX2__ missing, so this if is no longer reached. Alternatively, the #ifdef AVXnnn guards would have to be replaced with something like __x86_64__, but I don't know whether that can work.
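For context, ggml_cpu_has_avx2() reports a compile-time setting rather than probing the CPU at runtime. The sketch below paraphrases that pattern (it is not the exact ggml source); it shows why the branch above becomes dead code when __AVX2__ is not defined for the translation unit:

```c
// Paraphrased sketch of the ggml-style feature check, not the exact source.
// The function only reflects what the compiler was told at build time:
int ggml_cpu_has_avx2(void) {
#if defined(__AVX2__)
    return 1;
#else
    return 0;   // if __AVX2__ is not defined when this file is compiled,
                // `if (ggml_cpu_has_avx2())` above can never be true
#endif
}
```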

Name and Version

b4020 tag

What operating system are you seeing the problem on?

No response

Relevant log output

No response

@Djip007 added the bug-unconfirmed and medium severity labels on Nov 4, 2024