Fedora Packages

python3-ramalama Subpackage of python-ramalama

RamaLama is a command-line tool for working with AI large language models (LLMs).

On first run, RamaLama inspects your system for GPU support, falling back to CPU support if no GPU is present. It then uses a container engine such as Podman to pull an OCI image containing all of the software needed to run an AI model on your system's setup. This eliminates the need for users to configure the system for AI themselves. After this initialization, RamaLama runs AI models inside a container based on that OCI image.
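A minimal sketch of the workflow described above, using RamaLama's CLI (the exact model reference `ollama://tinyllama` is an illustrative assumption; any supported model source works):

```shell
# Install the package on Fedora (python3-ramalama is pulled in as a dependency).
sudo dnf install python3-ramalama

# On first use, RamaLama detects GPU/CPU support and has Podman pull a
# matching OCI image; no manual AI setup is required.
ramalama pull ollama://tinyllama   # fetch a model (example reference)
ramalama run ollama://tinyllama    # run the model inside a container
ramalama list                      # show locally available models
```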

Releases Overview

Release         Stable        Testing
Fedora Rawhide  0.6.1-1.fc43  -
Fedora 42       0.6.0-1.fc42  -
Fedora 41       0.5.5-1.fc41  0.6.0-1.fc41
Fedora 40       0.5.5-1.fc40  0.6.0-1.fc40
Fedora EPEL 9   0.5.2-1.el9   0.6.0-1.el9
Package Info

You can contact the maintainers of this package via email at python-ramalama-maintainers@fedoraproject.org.

Sources on Pagure