Machine Learning in Linux: Alpaca – chat with local AI models

Summary

Alpaca is a great Ollama client. Its model manager makes it very easy to pull models directly from the app, and GPU acceleration works out of the box, although we only tested with NVIDIA cards.

The software also offers the ability to append YouTube transcripts and text from websites to a prompt, and it supports PDF recognition. There’s an option to use a remote connection to Ollama, and to run Alpaca in the background. We can tweak the model’s temperature, the seed, and the keep-alive time.
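Under the hood, settings like these map onto Ollama’s REST API, which accepts `temperature` and `seed` in an `options` object and a `keep_alive` duration alongside the prompt. Here is a minimal sketch of building such a request payload in Python; the function name, default values, and model name are illustrative assumptions, not Alpaca’s actual code.

```python
import json

def build_ollama_request(model, prompt, temperature=0.7, seed=None, keep_alive="5m"):
    """Build a JSON payload for Ollama's /api/generate endpoint.

    temperature, seed, and keep_alive mirror the settings Alpaca
    exposes; the defaults here are illustrative assumptions.
    """
    payload = {
        "model": model,
        "prompt": prompt,
        "options": {"temperature": temperature},
        "keep_alive": keep_alive,  # how long the model stays loaded in memory
    }
    if seed is not None:
        # A fixed seed makes generations reproducible for the same prompt.
        payload["options"]["seed"] = seed
    return json.dumps(payload)

# A remote Ollama connection simply means POSTing this body to another
# host, e.g. http://192.168.1.10:11434/api/generate instead of localhost.
request_body = build_ollama_request("llama3", "Hello", temperature=0.2, seed=42)
```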

The software is under active development and the developer plans a substantial rewrite. We’ll definitely follow developments closely.

Naturally, there are areas which would benefit from refinement. For example, the presentation of complex mathematical equations needs polishing.

Website: jeffser.com/alpaca
Support: GitHub Code Repository
Developer: Jeffry Samuel
License: GNU General Public License v3.0

Alpaca is written in Python. Learn Python with our recommended free books and free tutorials.

Artificial intelligence icon For other useful open source apps that use machine learning/deep learning, we’ve compiled this roundup.

Pages in this article:
Page 1 – Introduction and Installation
Page 2 – In Operation
Page 3 – Image Recognition, Code Highlighting
Page 4 – Summary
