On Mon, 2024-09-30 at 11:18 -0400, Daniel Walsh wrote:
> RamaLama is an open source competitor to Ollama. The goal is to make
> the use of AI Models as simple as Podman or Docker, but able to
> support any AI Model registry: HuggingFace, Ollama, as well as OCI
> registries (quay.io, Docker Hub, Artifactory, ...).
>
> It uses either Podman or Docker under the hood to run your AI Models
> in containers, but can also run the models natively on the host.
>
> We are looking for contributors in any form, but could really use
> some help getting it packaged for Fedora, PyPI, and Homebrew for
> macOS.
>
> We have set up a Discord room for discussions on RamaLama:
> https://t.co/wdJ2KWJ9de
>
> The code is all written in Python.
>
> Join the initiative to make running Open Source AI Models simple and
> boring.

Having a quick look at it... I assume for packaging purposes we should
avoid that yoiks-inducing `install.py` like the plague? Is the setup.py
file sufficient to install it properly in a normal way? On the face of
it, it doesn't look like it would be, but maybe I'm missing something.
Given that we're in the 2020s, why doesn't it have a pyproject.toml?

Thanks!
-- 
Adam Williamson (he/him/his)
Fedora QA
Fedora Chat: @adamwill:fedora.im | Mastodon: @adamw@xxxxxxxxxxxxx
https://www.happyassassin.net
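
P.S. Purely as an illustration of what's missing: a minimal,
setuptools-based pyproject.toml for a project like this might look
something like the sketch below. To be clear, none of this is taken
from the RamaLama repo; the version, Python floor, and console-script
entry point are my guesses.

    [build-system]
    requires = ["setuptools>=61"]
    build-backend = "setuptools.build_meta"

    [project]
    name = "ramalama"
    version = "0.0.1"  # placeholder, not the project's real version
    description = "Run AI models in containers using Podman or Docker"
    requires-python = ">=3.9"  # assumed floor, for illustration

    [project.scripts]
    ramalama = "ramalama.cli:main"  # hypothetical entry point

With something like that in place, `pip install .` and `python -m
build` work the standard way, and a Fedora package could use the usual
pyproject-rpm-macros (%pyproject_wheel / %pyproject_install) instead of
anything custom like install.py.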