File ollama.spec of Package ollama
#
# spec file for package ollama
#
# Copyright (c) 2024 SUSE LLC
#
# All modifications and additions to the file contributed by third parties
# remain the property of their copyright owners, unless otherwise agreed
# upon. The license for this file, and modifications and additions to the
# file, is the same license as for the pristine package itself (unless the
# license for the pristine package is not an Open Source License, in which
# case the license is the MIT License). An "Open Source License" is a
# license that conforms to the Open Source Definition (Version 1.9)
# published by the Open Source Initiative.
# Please submit bugfixes or comments via https://bugs.opensuse.org/
#


%global rocm_major 6
%global rocm_minor 2
%global rocm_release %{rocm_major}.%{rocm_minor}

Name:           ollama-rocm-%{rocm_major}-%{rocm_minor}
Version:        0.4.2
Release:        0
Summary:        Tool for running AI models on-premise
License:        MIT
URL:            https://ollama.com
Source:         ollama-%{version}.tar
Source1:        vendor.tar.zstd
Source2:        ollama.service
Source3:        ollama-user.conf
Patch0:         ollama-pr7499.patch
Patch1:         ollama-add-install-targets.patch
Patch2:         ollama-use-external-cc.patch
Patch3:         ollama-verbose-tests.patch
Patch4:         ollama-lib64-runner-path.patch
BuildRequires:  cmake >= 3.24
BuildRequires:  git
BuildRequires:  ninja
BuildRequires:  sysuser-tools
BuildRequires:  zstd
BuildRequires:  golang(API) >= 1.22
%sysusers_requires
%if 0%{?sle_version} == 150600
BuildRequires:  gcc12-c++
BuildRequires:  libstdc++6-gcc12
%else
BuildRequires:  gcc-c++ >= 11.4.0
%endif
BuildRequires:  rocm-hip-devel
BuildRequires:  rocm-rpm-macros
BuildRequires:  rocm-release(clang) = %{rocm_release}
BuildRequires:  rocm-release(lld) = %{rocm_release}
BuildRequires:  rocm-release(llvm) = %{rocm_release}
BuildRequires:  rocm-release(rocblas-devel) = %{rocm_release}
BuildRequires:  rocm-release(rocm-runtime-devel) = %{rocm_release}
BuildRequires:  rocm-release(rocprim-headers) = %{rocm_release}
BuildRequires:  rocm-release(rocsolver-devel) = %{rocm_release}
BuildRequires:  rocm-release(rocsparse-devel) = %{rocm_release}
BuildRequires:  cmake(COMgr-ROCm) = %{rocm_release}
BuildRequires:  rocm-release(hipblas-devel) = %{rocm_release}
BuildRequires:  pkgconfig(hsakmt)
BuildRequires:  pkgconfig(libdrm)
BuildRequires:  pkgconfig(libdrm_amdgpu)
Requires:       rocm-release(clang-libs) = %{rocm_release}
Requires:       rocm-release(llvm-libs) = %{rocm_release}
Requires:       rocm-release(rocblas) = %{rocm_release}
Requires:       rocm-release(rocm-runtime) = %{rocm_release}
Requires:       rocm-release(rocsolver) = %{rocm_release}
Requires:       rocm-release(rocsparse) = %{rocm_release}
Requires:       rocm-release(hipblas) = %{rocm_release}
Conflicts:      ollama
# 32bit seems not to be supported anymore
ExcludeArch:    %ix86 %arm

%description
Ollama is a tool for running AI models on one's own hardware. It offers a
command-line interface and a RESTful API. New models can be created or
existing ones modified in the Ollama library using the Modelfile syntax.
Source model weights found on Hugging Face and similar sites can be imported.

%prep
%autosetup -a1 -p1 -n ollama-%{version}

%build
%sysusers_generate_pre %{SOURCE3} ollama ollama-user.conf
%ifnarch ppc64
export GOFLAGS="-buildmode=pie -mod=vendor"
%endif
%if 0%{?sle_version} == 150600
export CXX=g++-12
export CC=gcc-12
# pie doesn't work with gcc12 on leap
export GOFLAGS="-mod=vendor"
%endif
%make_build HIP_PATH=/usr HIP_ARCHS="%{list_sep rocm_gpu_list ;}" rocm exe

%install
%__make HIP_PATH=/usr HIP_ARCHS="%{list_sep rocm_gpu_list ;}" DESTDIR=%{buildroot} install_rocm install_ollama
install -D -m 0644 %{SOURCE2} %{buildroot}%{_unitdir}/ollama.service
install -D -m 0644 %{SOURCE3} %{buildroot}%{_sysusersdir}/ollama-user.conf
install -d %{buildroot}%{_localstatedir}/lib/ollama
mkdir -p "%{buildroot}/%{_docdir}/ollama"
cp -Ra docs/* "%{buildroot}/%{_docdir}/ollama"

%check
%if 0%{?sle_version} == 150600
export CXX=g++-12
export CC=gcc-12
# pie doesn't work with gcc12 on leap
export GOFLAGS="-mod=vendor"
%endif
%__make test

%pre -f ollama.pre
%service_add_pre ollama.service

%post
%service_add_post ollama.service

%preun
%service_del_preun ollama.service

%postun
%service_del_postun ollama.service

%files
%doc README.md
%dir %{_libdir}/ollama/
%dir %{_libdir}/ollama/runners/
%license LICENSE
%{_docdir}/ollama
%{_bindir}/ollama
%{_libdir}/ollama/runners/*
%{_unitdir}/ollama.service
%{_sysusersdir}/ollama-user.conf
%attr(-, ollama, ollama) %{_localstatedir}/lib/ollama

%changelog
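The %description above mentions Ollama's command-line interface and RESTful API. As a quick illustration, not part of the spec itself, typical usage after installing this package and starting the bundled ollama.service could look like the following; the model name is only an example:

    # Enable and start the service installed by this package
    sudo systemctl enable --now ollama.service

    # Pull a model and run a one-off prompt with the CLI
    ollama pull llama3.2
    ollama run llama3.2 "Why is the sky blue?"

    # The same model served over the REST API (default: localhost:11434)
    curl http://localhost:11434/api/generate \
        -d '{"model": "llama3.2", "prompt": "Why is the sky blue?", "stream": false}'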