A curated list of high-quality tools, communities, and guides.

Navigating the rapidly evolving world of Generative AI can be overwhelming. We have curated the resources we use daily at Sentri Cloud to stay ahead of the curve. These links focus heavily on open-source, self-hosted, and privacy-first AI.

🛠️ The Core Stack (Tools We Deploy)

These are the foundational tools we use in our Private AI deployments.
  • Ollama
    • The “Docker for LLMs.” The easiest way to get up and running with large language models locally on macOS, Linux, and Windows.
  • n8n
    • Workflow Automation. A fair-code alternative to Zapier/Make that is incredibly powerful for chaining AI agents together.
  • Open WebUI (formerly Ollama WebUI)
    • The Interface. An extensible, self-hosted UI that mimics the ChatGPT experience but connects to your private models.
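Once Ollama is running, both Open WebUI and n8n talk to it over its local HTTP API (by default on port 11434). A minimal sketch of calling that API directly from Python, assuming the server is running and a model such as `llama3` has already been pulled:

```python
import json
from urllib import request

OLLAMA_URL = "http://localhost:11434/api/generate"  # Ollama's default local endpoint


def build_payload(model: str, prompt: str) -> dict:
    """Build the JSON body for a non-streaming /api/generate request."""
    return {"model": model, "prompt": prompt, "stream": False}


def generate(model: str, prompt: str) -> str:
    """Send a prompt to a locally running Ollama server and return the response text."""
    body = json.dumps(build_payload(model, prompt)).encode("utf-8")
    req = request.Request(
        OLLAMA_URL, data=body, headers={"Content-Type": "application/json"}
    )
    with request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]


# Usage (requires a running server and `ollama pull llama3`):
# print(generate("llama3", "Explain quantization in one sentence."))
```

Because everything stays on `localhost`, no prompt or completion ever leaves the machine, which is the point of this stack.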

📚 Learning & Documentation

  • LocalLlama Subreddit (r/LocalLLaMA)
    • The Community Hub. The single best place for bleeding-edge discussions on hardware, model quantization, and fine-tuning.
  • DeepLearning.AI Short Courses
    • Free Education. High-quality, 1-hour courses by Andrew Ng and industry partners (LangChain, Microsoft) on building with LLMs.
  • Hugging Face Models
    • The “GitHub of AI.” Browse thousands of open-source models. Look for “GGUF” formats for easy local hosting.

🤖 Model Leaderboards & News

💻 Hardware for Self-Hosting

🚀 Starter Prompts & Engineering

  • Anthropic Prompt Library
    • Best Practices. Even if you don’t use Claude, their prompt engineering guides are industry-leading and applicable to Llama 3/Mistral.
  • Learn Prompting
    • Comprehensive Course. A free, open-source course on communicating effectively with AI.
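The guides above converge on a few structural habits: give the model an explicit role, delimit input data clearly, and state the expected output format. A minimal, model-agnostic sketch of such a template in Python; the wording and XML-style tags are illustrative, not taken from any one guide:

```python
def build_prompt(role: str, task: str, document: str, output_format: str) -> str:
    """Assemble a structured prompt: explicit role, delimited input, stated output format."""
    return (
        f"You are {role}.\n\n"
        f"Task: {task}\n\n"
        f"<document>\n{document}\n</document>\n\n"
        f"Respond only with {output_format}."
    )


prompt = build_prompt(
    role="a careful technical summarizer",
    task="Summarize the document in three bullet points.",
    document="Ollama lets you run large language models locally.",
    output_format="a Markdown bullet list",
)
print(prompt)
```

The same template works whether the string is sent to Claude, Llama 3, or Mistral, which is why these guides transfer so well across models.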
Found a dead link or have a suggestion? Let us know.