Ollama Articles

Written by thoughtbot, your expert partner for design and development.

  1. AI in Focus: Connecting DeepSeek to Rails

    Experimenting with a locally running version of the DeepSeek R1 LLM in a real Rails app using Ollama and ruby-openai; a minimal connection sketch follows this list.

    Chad Pytel and Kate Young
    April 24, 2025
    • AI
    • Ethics
    • Product
    • Language Models
    • Ollama
    • Rails
    • LLM
    • Large Language Models
  2. How to use ngrok and Ollama to access a local LLM remotely

    Use ngrok to securely expose your local Ollama instance to the internet, making it accessible from anywhere; the second sketch after this list shows the client-side change.

    Jose Blanco
    March 11, 2025
    • Artificial Intelligence
    • Ollama
    • ngrok
    • LLM
    • AI
    • Development
  3. Understanding open source LLMs

    Do you think you can run any Large Language Model (LLM) on your machine?

    Rakesh Arunachalam
    June 17, 2024
    • Open Source
    • Artificial Intelligence
    • Language Models
    • Ollama
  4. How to use an open source LLM model locally and remotely

    Use Ollama to run an open source large language model on your local machine and on a DigitalOcean remote virtual machine.

    Jose Blanco and Kate Young
    February 8, 2024
    • Open Source
    • Artificial Intelligence
    • Language Models
    • Ollama
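
To give a concrete flavour of the setup the articles above describe, here is a minimal sketch of talking to a locally running Ollama model from Ruby with the ruby-openai gem. It assumes Ollama is listening on its default port 11434 and exposing its OpenAI-compatible API, and that a model such as deepseek-r1 has already been pulled; the dummy access token and the model name are illustrative assumptions, not excerpts from the articles.

  require "openai" # from the ruby-openai gem

  # Point the client at the local Ollama server instead of api.openai.com.
  # Ollama ignores the access token, but the gem expects one to be set.
  client = OpenAI::Client.new(
    access_token: "ollama",
    uri_base: "http://localhost:11434" # assumed default Ollama port
  )

  response = client.chat(
    parameters: {
      model: "deepseek-r1", # assumes `ollama pull deepseek-r1` has been run
      messages: [{ role: "user", content: "Summarise Ollama in one sentence." }],
      temperature: 0.7
    }
  )

  puts response.dig("choices", 0, "message", "content")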
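
The ngrok article applies the same idea remotely: once a tunnel is running in front of port 11434, the only change on the Ruby side is the base URL. The hostname below is a placeholder; ngrok assigns the real one when the tunnel starts, and any authentication in front of the tunnel is up to you.

  require "openai"

  # Same client as above, pointed at the public tunnel instead of localhost.
  remote_client = OpenAI::Client.new(
    access_token: "ollama",
    uri_base: "https://example.ngrok-free.app" # placeholder tunnel hostname
  )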
