Robots.txt

A file that tells web crawlers which pages on your site you want them to crawl and which to ignore. This can be useful for keeping duplicate content out of search engine indexes, but it isn't a reliable way to hide information: the file itself is publicly readable, and compliance with it is voluntary, so a crawler can simply ignore it.
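
For illustration, a minimal robots.txt might look like the sketch below. The User-agent, Disallow, Allow, and Sitemap directives are standard; the specific paths and the sitemap URL are placeholders, not values from any real site.

    # Applies to all crawlers
    User-agent: *
    # Hypothetical sections to keep out of search indexes
    Disallow: /drafts/
    Disallow: /search
    # Explicitly permit one page inside a disallowed section
    Allow: /drafts/public-preview

    # Placeholder sitemap location
    Sitemap: https://www.example.com/sitemap.xml

To be picked up by crawlers, the file must live at the root of the domain (for example, https://www.example.com/robots.txt).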
