Search engines “crawl” and “index” web content through programs called robots (a.k.a. crawlers or spiders). Here are some approaches to blocking them in Ruby on Rails apps.
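As a minimal sketch of two of those approaches: a site-wide robots.txt and an X-Robots-Tag response header. This is plain Ruby so it runs standalone; in a real Rails app the robots.txt text would live in public/robots.txt (or be served from a route), and the header would be set in a controller. The `block_robots` helper name is illustrative, not a Rails API.

```ruby
# 1. A robots.txt that asks all well-behaved crawlers to skip the whole site.
#    In a Rails app this content would go in public/robots.txt.
ROBOTS_TXT = <<~TXT
  User-agent: *
  Disallow: /
TXT

# 2. An X-Robots-Tag response header. `block_robots` is a hypothetical helper
#    that merges the header into a Rack-style headers hash; in Rails you would
#    set response.headers["X-Robots-Tag"] in a controller instead.
def block_robots(headers)
  headers.merge("X-Robots-Tag" => "noindex, nofollow")
end

puts ROBOTS_TXT
puts block_robots({}).fetch("X-Robots-Tag")
```

A page-level alternative is a meta tag in the layout: `<meta name="robots" content="noindex, nofollow">`. Note that all of these are requests, not enforcement; misbehaving crawlers can ignore them.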
bundle outdated can be useful for keeping your Bundler dependencies up to date.
Getting started with Action Mailer and Active Job.
A fake to test your Stripe code without hitting Stripe’s servers.
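To illustrate the idea of a fake: a stand-in object that mimics the Stripe client's interface so tests never make a network call. The class and field names below are illustrative assumptions, not Stripe's or thoughtbot's actual API.

```ruby
# A hypothetical fake for Stripe's charge-creation call. Tests exercise code
# against this object instead of the real client, so no request leaves the box.
class FakeStripeCharge
  attr_reader :amount, :currency, :paid

  # Mirrors the shape of a real client call like Stripe::Charge.create,
  # but builds an in-memory object instead of hitting Stripe's servers.
  def self.create(amount:, currency: "usd")
    new(amount: amount, currency: currency)
  end

  def initialize(amount:, currency:)
    @amount = amount
    @currency = currency
    @paid = true # the fake always "succeeds"; add a failing variant as needed
  end
end

charge = FakeStripeCharge.create(amount: 2000)
puts charge.paid
```

A failing-payment variant of the fake (returning `paid: false` or raising) lets you test error paths just as cheaply.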
You may not get the results you expected.
Master ActiveRecord and take full advantage of your database with our newest course on Upcase.
How to prepare your Rails application so you won’t get left out in the cold.