
The Archive Base



Asked by Editorial Team on May 11, 2026 at 8:27 am

Has anybody got any C# code to parse robots.txt and then evaluate URLs against it?


Short question:

Has anybody got any C# code to parse robots.txt and then evaluate URLs against it, to see whether they would be excluded or not?

Long question:

I have been creating a sitemap for a new site that has not yet been released to Google. The sitemap has two modes: a user mode (like a traditional sitemap) and an ‘admin’ mode.

The admin mode will show all possible URLs on the site, including customized entry URLs or URLs for a specific outside partner, such as example.com/oprah for anyone who sees our site on Oprah. I want to track published links somewhere other than in an Excel spreadsheet.

I have to assume that someone might publish the /oprah link on their blog or somewhere else. We don’t actually want this ‘mini-Oprah site’ to be indexed, because that would let non-Oprah viewers find the special Oprah offers.

So at the same time I was creating the sitemap, I also added URLs such as /oprah to the exclusions in our robots.txt file.
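For context, such an exclusion is a one-line rule in robots.txt. A sketch of what the entry might look like (the /oprah path comes from the question; the wildcard User-agent line is the standard form for "all crawlers"):

```
User-agent: *
Disallow: /oprah
```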

Then (and this is the actual question) I thought: wouldn’t it be nice to show on the sitemap whether or not files are indexed and visible to robots? This should be quite simple: just parse robots.txt and then evaluate a link against it.

However, this is a ‘bonus feature’ and I certainly don’t have time to go off and write it (even though it’s probably not that complex), so I was wondering if anyone has already written any code to parse robots.txt?
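For what it’s worth, the logic being asked for is small. The question wants C#, but Python’s standard library ships exactly this facility (`urllib.robotparser`), which makes the behaviour easy to sketch; a C# port would follow the same shape (parse the Disallow rules per user-agent, then prefix-match each URL’s path). A minimal illustration, assuming a robots.txt that disallows /oprah as in the question:

```python
from urllib.robotparser import RobotFileParser

# Parse a robots.txt body directly. In production you would instead call
# rfp.set_url("https://example.com/robots.txt") and rfp.read() to fetch it.
rfp = RobotFileParser()
rfp.parse("""
User-agent: *
Disallow: /oprah
""".splitlines())

# Evaluate URLs against the rules, as the question describes.
print(rfp.can_fetch("*", "https://example.com/oprah"))  # False: excluded
print(rfp.can_fetch("*", "https://example.com/about"))  # True: crawlable
```

The sitemap generator could call `can_fetch` for each link and badge it as visible or hidden to robots.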



1 Answer

1. Answered on May 11, 2026 at 8:27 am

Hate to say it, but just Google ‘C# robots.txt parser’ and click the first hit. It’s a CodeProject article about a simple search engine implemented in C# called ‘Searcharoo’, and it contains a class Searcharoo.Indexer.RobotsTxt, described as:

    1. Check for, and if present, download and parse the robots.txt file on the site
    2. Provide an interface for the Spider to check each Url against the robots.txt rules
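The two steps the answer describes (fetch and parse once, then check each URL) map onto a very small class. A hedged sketch of that interface in Python; the class and method names here are invented for illustration and are not Searcharoo’s actual API, and the sketch honours only the wildcard user-agent group with plain prefix matching (no Allow rules or wildcards):

```python
from urllib.parse import urlparse

class RobotsTxt:
    """Minimal robots.txt rule holder: parse once, then check many URLs."""

    def __init__(self, robots_body: str):
        self.disallowed = []
        applies = False
        for line in robots_body.splitlines():
            line = line.split("#", 1)[0].strip()  # drop comments and padding
            if not line:
                continue
            field, _, value = line.partition(":")
            field, value = field.strip().lower(), value.strip()
            if field == "user-agent":
                applies = value == "*"            # honour only the wildcard group
            elif field == "disallow" and applies and value:
                self.disallowed.append(value)

    def is_allowed(self, url: str) -> bool:
        path = urlparse(url).path or "/"
        return not any(path.startswith(rule) for rule in self.disallowed)

rules = RobotsTxt("User-agent: *\nDisallow: /oprah\n")
print(rules.is_allowed("https://example.com/oprah"))  # False
print(rules.is_allowed("https://example.com/"))       # True
```

A spider (or the sitemap generator from the question) would construct one `RobotsTxt` per site and call `is_allowed` per URL.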

© 2021 The Archive Base. All Rights Reserved