What happens when we stop using validations for data integrity and instead use them for user interface?
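The idea can be sketched without Rails at all. Below is a dependency-free illustration (class and message names are hypothetical): the validation exists to produce a friendly, user-facing message, while actual data integrity would be enforced elsewhere, e.g. by database constraints.

```ruby
# Hypothetical sketch: a validation whose job is the UI message,
# not the integrity guarantee. In a real Rails app this would be an
# Active Record / Active Model validation, with NOT NULL and UNIQUE
# constraints at the database level guarding integrity.
class SignupForm
  attr_reader :email, :errors

  def initialize(email)
    @email = email
    @errors = []
  end

  # The message is written for a person filling in a form,
  # not for a database error log.
  def valid?
    @errors << "Email is required so we can reach you" if email.to_s.strip.empty?
    @errors.empty?
  end
end

form = SignupForm.new("")
form.valid?          # false
form.errors          # ["Email is required so we can reach you"]
```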
Search engines “crawl” and “index” web content through programs called robots (a.k.a. crawlers or spiders). Here are some approaches to blocking them in Ruby on Rails apps.
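One such approach can be sketched as plain Rack middleware (the class name and user-agent pattern below are illustrative, not from any particular library): requests whose User-Agent matches a known crawler get a 403 before they reach the Rails app.

```ruby
# A minimal sketch: Rack middleware that blocks requests from
# user agents matching a (hypothetical, incomplete) crawler pattern.
class BlockRobots
  ROBOT_PATTERN = /googlebot|bingbot|baiduspider/i

  def initialize(app)
    @app = app
  end

  def call(env)
    if env["HTTP_USER_AGENT"].to_s.match?(ROBOT_PATTERN)
      # Short-circuit with a plain 403 instead of calling the app
      [403, { "Content-Type" => "text/plain" }, ["Forbidden"]]
    else
      @app.call(env)
    end
  end
end
```

In a Rails app this would be registered with `config.middleware.use BlockRobots`. Note that well-behaved crawlers also respect robots.txt, which is the gentler first line of defense.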
bundle outdated can be useful for keeping your Bundler dependencies up to date.
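Typical usage looks like this, run from the directory containing your Gemfile (output will vary by project):

```shell
# List gems with newer versions available than the ones locked
# in Gemfile.lock
bundle outdated

# Only report updates that still satisfy the version requirements
# declared in the Gemfile
bundle outdated --strict
```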
Getting started with Action Mailer and Active Job.
A fake to test your Stripe code without hitting Stripe’s servers.
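The pattern behind such a fake can be sketched in plain Ruby (all names below are illustrative, not Stripe's actual gem interface): your code receives a client object, so tests can hand it a stand-in that records charges in memory instead of making network calls.

```ruby
# Hypothetical in-memory stand-in for the small slice of a payment
# API the code under test touches. No network, no Stripe servers.
class FakeCharges
  Charge = Struct.new(:id, :amount, :currency, :paid)

  def initialize
    @charges = []
  end

  attr_reader :charges

  def create(amount:, currency:)
    charge = Charge.new("ch_#{@charges.size + 1}", amount, currency, true)
    @charges << charge
    charge
  end
end

# The code under test takes the client as a dependency rather than
# reaching for a global constant, so the fake can be swapped in.
class CheckoutService
  def initialize(charges_client)
    @charges = charges_client
  end

  def charge!(amount_cents)
    @charges.create(amount: amount_cents, currency: "usd")
  end
end
```

In production the service would be constructed with the real client; in tests, with the fake, which can then be inspected to assert what was charged.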
You may not get the results you expected.