[PRQ#75683] Merge Request for llama.cpp-cuda-f16
26 Aug 2025, 10:26 a.m.
envolution [1] filed a request to merge llama.cpp-cuda-f16 [2] into llama.cpp-cuda [3]:

upstream now detects f16 mode automatically, negating the need for this package

[1] https://aur.archlinux.org/account/envolution/
[2] https://aur.archlinux.org/pkgbase/llama.cpp-cuda-f16/
[3] https://aur.archlinux.org/pkgbase/llama.cpp-cuda/
28 Aug 2025, 9:53 a.m.
New subject: [PRQ#75683] Merge Request for llama.cpp-cuda-f16 Accepted
Request #75683 has been Accepted by Muflone [1]:

[Autogenerated] Accepted merge for llama.cpp-cuda-f16 into llama.cpp-cuda.

[1] https://aur.archlinux.org/account/Muflone/