

The Archive Base Latest Questions

Editorial Team
Asked: May 10, 2026

I want to read the contents of a URL but don’t want to hang

I want to read the contents of a URL but don’t want to ‘hang’ if the URL is unresponsive. I’ve created a BufferedReader using the URL…

URL theURL = new URL(url);
URLConnection urlConn = theURL.openConnection();
urlConn.setDoOutput(true);
BufferedReader urlReader = new BufferedReader(
        new InputStreamReader(urlConn.getInputStream()));

…and then begun the loop to read the contents…

do {
    buf = urlReader.readLine();
    if (buf != null) {
        resultBuffer.append(buf);
        resultBuffer.append('\n');
    }
} while (buf != null);

…but if the read hangs then the application hangs.

Is there a way, without grinding the code down to the socket level, to ‘time out’ the read if necessary?

1 Answer

  1. Answered on May 10, 2026 at 3:28 pm

    I think URLConnection.setReadTimeout is what you are looking for.
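
    A minimal sketch of how that could look, assuming the question's own setup (the URL string and the timeout values here are placeholders). Setting both a connect timeout and a read timeout on the `URLConnection` before opening the stream makes a stalled connection or read fail with `java.net.SocketTimeoutException` instead of hanging indefinitely:

    ```java
    import java.io.BufferedReader;
    import java.io.InputStreamReader;
    import java.net.URL;
    import java.net.URLConnection;

    public class TimedUrlRead {

        // Apply both timeouts before any I/O happens; a value of 0 means
        // "wait forever", which is exactly the hang the question describes.
        static void configureTimeouts(URLConnection conn, int connectMillis, int readMillis) {
            conn.setConnectTimeout(connectMillis); // limit time to establish the connection
            conn.setReadTimeout(readMillis);       // limit time any single read() may block
        }

        // The same read loop as in the question, but a stalled read now
        // throws java.net.SocketTimeoutException instead of hanging.
        static String readUrl(String url, int connectMillis, int readMillis) throws Exception {
            URLConnection conn = new URL(url).openConnection();
            configureTimeouts(conn, connectMillis, readMillis);
            StringBuilder result = new StringBuilder();
            try (BufferedReader reader = new BufferedReader(
                    new InputStreamReader(conn.getInputStream()))) {
                String line;
                while ((line = reader.readLine()) != null) {
                    result.append(line).append('\n');
                }
            }
            return result.toString();
        }
    }
    ```

    Note that the timeout applies to each individual blocking read, not to the whole download, so a slow-but-steady server can still keep the loop running longer than `readMillis` in total.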




© 2021 The Archive Base. All Rights Reserved