File llamacpp.changes of Package llamacpp
-------------------------------------------------------------------
Tue Oct 15 22:33:33 UTC 2024 - eyadlorenzo@gmail.com

- Update to version 3922:
  * llama : add infill sampler (#9896)
  * server : improve infill context reuse (#9894)
  * sampling : add XTC sampler (#9742)
  * server : update preact (#9895)
  * readme : update bindings list (#9889)

-------------------------------------------------------------------
Mon Oct 14 08:52:45 UTC 2024 - Eyad Issa <eyadlorenzo@gmail.com>

- Update to version 3917:
  * server : handle "logprobs" field with false value (#9871)
  * Vectorize load instructions in dmmv f16 CUDA kernel (#9816)
  * server : accept extra_context for the infill endpoint (#9874)
  * server : reuse cached context chunks (#9866)
  * flake.lock: Update (#9870)

-------------------------------------------------------------------
Mon Oct 14 08:16:06 UTC 2024 - Eyad Issa <eyadlorenzo@gmail.com>

- Add Vulkan support

-------------------------------------------------------------------
Sat Oct 12 19:43:58 UTC 2024 - Eyad Issa <eyadlorenzo@gmail.com>

- Update to version 3912:
  * server : add option to time limit the generation phase (#9865)
  * server : remove self-extend features (#9860)
  * server : remove legacy system_prompt feature (#9857)

-------------------------------------------------------------------
Sat Oct 12 14:28:06 UTC 2024 - Eyad Issa <eyadlorenzo@gmail.com>

- Initial packaging