
Editorial Team
Asked: May 12, 2026

I’m writing a web crawler for a specific site. The application is a VB.Net Windows Forms application that does not use multiple threads – each web request is made sequentially. However, after ten successful page retrievals, every subsequent request times out.

I have reviewed the similar questions already posted here on SO, and have implemented the recommended techniques into my GetPage routine, shown below:

' Assumes Imports System.IO and Imports System.Net at the top of the file.
Public Function GetPage(ByVal url As String) As String
    Dim result As String = String.Empty

    Dim uri As New Uri(url)
    Dim sp As ServicePoint = ServicePointManager.FindServicePoint(uri)
    sp.ConnectionLimit = 100

    Dim request As HttpWebRequest = DirectCast(WebRequest.Create(uri), HttpWebRequest)
    request.KeepAlive = False
    request.Timeout = 15000

    Try
        Using response As HttpWebResponse = DirectCast(request.GetResponse, HttpWebResponse)
            Using dataStream As Stream = response.GetResponseStream()
                Using reader As New StreamReader(dataStream)
                    If response.StatusCode <> HttpStatusCode.OK Then
                        Throw New Exception("Got response status code: " & response.StatusCode.ToString())
                    End If
                    result = reader.ReadToEnd()
                End Using
            End Using
            response.Close()
        End Using

    Catch ex As Exception
        Dim msg As String = "Error reading page """ & url & """. " & ex.Message
        Logger.LogMessage(msg, LogOutputLevel.Diagnostics)
    End Try

    Return result

End Function

Have I missed something? Am I not closing or disposing of an object that should be? It seems strange that it always happens after ten consecutive requests.

Notes:

  1. In the constructor for the class in which this method resides I have the following:

    ServicePointManager.DefaultConnectionLimit = 100

  2. If I set KeepAlive to true, the timeouts begin after five requests.

  3. All the requests are for pages in the same domain.

EDIT

I added a delay of between two and seven seconds between each web request so that I do not appear to be “hammering” the site or attempting a DoS attack. However, the problem still occurs.
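For reference, a minimal sketch of a sequential crawl loop with that randomized delay is shown below; the CrawlAll routine, the urls parameter, and the Rng field are illustrative names assumed for the example, not code from the question:

' Minimal sketch: call GetPage for each URL, pausing two to seven seconds between requests.
Private ReadOnly Rng As New Random()

Private Sub CrawlAll(ByVal urls() As String)
    For Each url As String In urls
        Dim html As String = GetPage(url)
        ' ... process html here ...
        ' Wait between two and seven seconds before the next request.
        System.Threading.Thread.Sleep(Rng.Next(2000, 7001))
    Next
End Sub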


1 Answer

  1. Editorial Team
     Added an answer on May 12, 2026 at 6:21 am

    I think the site has some sort of DoS protection, which kicks in when it’s hit with a number of rapid requests. You may want to try setting the UserAgent on the web request.
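
    For illustration, setting a User-Agent on the request inside GetPage might look like the sketch below; the header string is an example value assumed here, not something given in the answer:

    ' Identify the client via the User-Agent header before sending the request.
    ' The header string is an illustrative example only.
    Dim request As HttpWebRequest = DirectCast(WebRequest.Create(uri), HttpWebRequest)
    request.UserAgent = "Mozilla/5.0 (Windows NT 10.0; Win64; x64)"
    request.KeepAlive = False
    request.Timeout = 15000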
