
The Archive Base


Editorial Team
Asked: May 11, 2026


I’ve been thinking for a while about disallowing every crawler except Ask, Google, Microsoft, and Yahoo! from my site.

The reasoning behind this is that I’ve never seen any traffic being generated by any of the other web-crawlers out there.

My questions are:

  1. Is there any reason not to?
  2. Has anybody done this?
  3. Did you notice any negative effects?
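For concreteness, the whitelist I have in mind would look something like the sketch below. The user-agent tokens are the ones these four crawlers have documented (Slurp for Yahoo!, msnbot for Microsoft, Teoma for Ask); verify them against each crawler’s current documentation before relying on this:

```text
# Whitelist sketch: allow only the four named crawlers, block the rest.
# An empty Disallow line means "nothing is disallowed" for that agent.

User-agent: Googlebot      # Google
Disallow:

User-agent: Slurp          # Yahoo!
Disallow:

User-agent: msnbot         # Microsoft
Disallow:

User-agent: Teoma          # Ask
Disallow:

# Everyone else
User-agent: *
Disallow: /
```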

Update:
Up till now I have used the blacklist approach: if I do not like a crawler, I add it to the disallow list.
I’m no fan of blacklisting, however, as it is a never-ending story: there are always more crawlers out there.

I’m not so much worried about the really ugly, misbehaving crawlers; they are detected and blocked automatically (and they typically do not ask for robots.txt anyhow 🙂).

However, many crawlers are not really misbehaving in any way; they just do not seem to generate any value for me or my customers.
There are, for example, a couple of crawlers that power websites claiming they will be “The Next Google, Only Better”. I’ve never seen any traffic coming from them, and I’m quite sceptical about them becoming better than any of the four search engines mentioned above.

Update 2:
I’ve been analysing the traffic to several sites for some time now. For reasonably small sites (around 100 unique human visitors a day, i.e. visitors that I cannot identify as non-human), about 52% of the generated traffic comes from automated processes.

60% of all automated visitors do not read robots.txt; 40% (21% of total traffic) do request robots.txt (this group includes Ask, Google, Microsoft, and Yahoo!).
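The kind of analysis behind these numbers can be sketched in a few lines of Python, assuming a standard combined-format access log. The user-agent heuristic and the sample log lines are illustrative only, not a real bot database:

```python
# Tally automated traffic and robots.txt fetches from a combined-format
# access log. BOT_PATTERN is a crude illustrative heuristic.
import re

BOT_PATTERN = re.compile(r"bot|crawler|spider|slurp", re.IGNORECASE)

SAMPLE_LOG = """\
1.2.3.4 - - [11/May/2026:06:18:06 +0000] "GET / HTTP/1.1" 200 512 "-" "Mozilla/5.0 (Windows NT 10.0)"
5.6.7.8 - - [11/May/2026:06:18:07 +0000] "GET /robots.txt HTTP/1.1" 200 64 "-" "Googlebot/2.1"
5.6.7.8 - - [11/May/2026:06:18:08 +0000] "GET /page HTTP/1.1" 200 2048 "-" "Googlebot/2.1"
9.9.9.9 - - [11/May/2026:06:18:09 +0000] "GET /page HTTP/1.1" 200 2048 "-" "SomeSpider/0.1"
"""

# host ident user [time] "method path proto" status bytes "referer" "agent"
LINE_RE = re.compile(
    r'\S+ \S+ \S+ \[.*?\] "(\S+) (\S+)[^"]*" \d+ \d+ "[^"]*" "([^"]*)"'
)

def analyse(lines):
    total = automated = robots_requests = 0
    for line in lines:
        m = LINE_RE.match(line)
        if not m:
            continue  # skip malformed lines
        total += 1
        method, path, agent = m.groups()
        if BOT_PATTERN.search(agent):
            automated += 1
            if path == "/robots.txt":
                robots_requests += 1
    return total, automated, robots_requests

total, automated, robots_requests = analyse(SAMPLE_LOG.splitlines())
print(f"{automated}/{total} hits automated, {robots_requests} robots.txt fetches")
# → 3/4 hits automated, 1 robots.txt fetches
```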

So my thinking is: if I block all the well-behaved crawlers that do not seem to generate any value for me, I could reduce bandwidth use and server load by around 12%–17%.
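The 12%–17% figure can be reproduced with a quick back-of-the-envelope calculation. The big-four share of total traffic (4%–9%) is my own assumption, chosen only to show where such a range could come from:

```python
# Back-of-the-envelope check of the 12%-17% savings estimate.
automated_share = 0.52               # fraction of all traffic that is automated
reads_robots = 0.40                  # fraction of automated visitors fetching robots.txt
polite_share = automated_share * reads_robots   # ~0.21 of total traffic

# Assumed (not measured): Ask + Google + Microsoft + Yahoo! account for
# roughly 4%-9% of total traffic.
big_four_low, big_four_high = 0.04, 0.09

savings_high = polite_share - big_four_low   # ~0.17
savings_low = polite_share - big_four_high   # ~0.12
print(f"blockable polite traffic: {savings_low:.0%} - {savings_high:.0%}")
# → blockable polite traffic: 12% - 17%
```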


1 Answer

  1. Answered on May 11, 2026 at 6:18 am

    The internet is a publishing mechanism. If you want to whitelist your site, you’re going against the grain, but that’s fine.

    Do you want to whitelist your site?

    Bear in mind that badly behaved bots which ignore robots.txt aren’t affected anyway (obviously), and well-behaved bots are probably there for a good reason; it’s just that the reason is opaque to you.
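Whichever way you decide, it is worth sanity-checking the whitelist before deploying it. Python’s standard-library robots.txt parser makes that easy; the rules below are a cut-down illustrative version of the whitelist, not the full file:

```python
# Verify that a whitelist robots.txt allows the intended crawler and
# blocks everyone else, using the standard-library parser.
import urllib.robotparser

RULES = """\
User-agent: Googlebot
Disallow:

User-agent: *
Disallow: /
"""

rp = urllib.robotparser.RobotFileParser()
rp.parse(RULES.splitlines())

print(rp.can_fetch("Googlebot", "/some/page"))       # → True  (whitelisted)
print(rp.can_fetch("SomeSpider/0.1", "/some/page"))  # → False (falls to *)
```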


© 2021 The Archive Base. All Rights Reserved