Create a Robots.txt File to Avoid Duplicate Content and Boost WordPress SEO

Robots.txt is a simple text file that you upload to your server's root directory. It contains instructions for the search engine crawlers indexing your site, e.g. follow this link, don't index that directory, and so on. By blocking crawlers from URLs that duplicate your main content (such as feed, trackback, and archive pages), you can reduce duplicate content issues and help search engines focus on the pages that matter.
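
As a rough illustration, here is a minimal robots.txt for a typical WordPress install. The disallowed paths and the Sitemap URL are examples only; adjust them to your own site structure and SEO strategy before uploading:

User-agent: *
Disallow: /wp-admin/
Disallow: /wp-includes/
Disallow: /trackback/
Disallow: /feed/
Sitemap: https://example.com/sitemap.xml

It's worth verifying the finished file with a robots.txt testing tool (for example, the one in Google Search Console) so you don't accidentally block pages you want indexed.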

You can get help creating a WordPress-specific robots.txt file here: Robots.txt For WordPress.

Discussion is open below in the comments if you want to add to this tip.


Want to Submit a Daily Tip to WPMUDEV?

If you’ve got a great tip for WordPress, WordPress Multisite, or BuddyPress users, send it our way on Twitter: @wpmudev and we’ll happily credit you. Create a tweetable title and let us know if you have more info or an article you’d like to link it to.
