

Local AI on Linux: Alternatives to Mac’s MLX Using LLM and Ollama

June 7, 2025 · Lothar Schulz

Want to run AI models locally on Linux? This guide walks you through setting up the llm CLI and Ollama to power your own private AI environment—no Apple Silicon required. Learn how to install tools, pull models, and start chatting with AI directly from your terminal using open and efficient Linux-native solutions.
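The setup described above can be sketched as a short terminal session. This is a minimal sketch, not an official walkthrough: the model name `llama3.2` is just an example (substitute any model from the Ollama library), and it assumes the `llm-ollama` plugin to bridge the `llm` CLI to locally running Ollama models.

```shell
# Install Ollama on Linux using its official install script
curl -fsSL https://ollama.com/install.sh | sh

# Pull a model locally -- llama3.2 is an example; any Ollama model works
ollama pull llama3.2

# Chat with the model directly from the terminal via Ollama
ollama run llama3.2 "Summarize what change data capture is in one sentence."

# Install the llm CLI and the llm-ollama plugin,
# which exposes locally pulled Ollama models to llm
pip install llm
llm install llm-ollama

# Chat with the same local model through the llm CLI
llm -m llama3.2 "Summarize what change data capture is in one sentence."
```

Everything here runs on commodity Linux hardware; no Apple Silicon or cloud API key is involved, and prompts never leave the machine.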


Copyright © 2003 - 2026 | lotharschulz.info