
Linux

Local AI on Linux: Alternatives to Mac’s MLX Using LLM and Ollama

June 7, 2025 Lothar Schulz

Want to run AI models locally on Linux? This guide walks you through setting up the llm CLI and Ollama to power your own private AI environment—no Apple Silicon required. Learn how to install tools, pull models, and start chatting with AI directly from your terminal using open and efficient Linux-native solutions.
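As a rough sketch of the workflow described there, the snippet below assumes the Ollama daemon is already running locally, that a model has been pulled (llama3.2 is just a placeholder for whatever you downloaded), and that the ollama Python package is installed (pip install ollama):

```python
# Minimal local-chat sketch. Assumptions: the Ollama daemon is running on its
# default local port, a model has already been pulled (e.g. "llama3.2" --
# substitute any model you have), and the `ollama` package is installed.
import ollama

response = ollama.chat(
    model="llama3.2",  # assumed model name; use whichever model you pulled
    messages=[{"role": "user", "content": "Summarize what cgroups are in two sentences."}],
)

# The reply text lives under message.content in the chat response.
print(response["message"]["content"])
```

The llm CLI covered in the post gives you the same kind of access straight from the terminal; the Python client shown here is just one more way to talk to a local Ollama server.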

linux disk full – lsof does not help

May 10, 2017 Lothar Schulz

Have you ever run into a full disk on (x)ubuntu? I did recently and, like many others, found myself puzzling over the differences between du and df […]
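To make the du/df gap that the post digs into concrete, here is a small illustrative sketch (the /var path is an arbitrary example, not taken from the article): shutil.disk_usage reports what the filesystem itself claims, roughly what df shows, while walking the tree and summing file sizes approximates du, which only counts files still visible in the directory tree.

```python
# Illustrative sketch of why du and df can report different numbers.
import os
import shutil

path = "/var"  # hypothetical directory to inspect; adjust as needed

# df-style: ask the filesystem containing `path` for its own accounting
# (this includes space held by deleted-but-still-open files, reserved blocks, etc.)
usage = shutil.disk_usage(path)
print(f"df-style  used: {usage.used / 1e9:.2f} GB of {usage.total / 1e9:.2f} GB")

# du-style: add up the sizes of files currently visible under the tree
visible = 0
for root, _dirs, files in os.walk(path, onerror=lambda err: None):
    for name in files:
        try:
            visible += os.lstat(os.path.join(root, name)).st_size
        except OSError:
            pass  # file disappeared or is unreadable; skip it
print(f"du-style  sum of visible files: {visible / 1e9:.2f} GB")
```

The two numbers rarely match exactly: block sizes, reserved space, and files deleted while a process still holds them open all contribute, which is the kind of discrepancy the post is about.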

