The Archive Base


Editorial Team
Asked: May 13, 2026

I’m writing an application in Python with PostgreSQL 8.3 which runs on several machines


I’m writing an application in Python with PostgreSQL 8.3 which runs on several machines on a local network.

All machines

1) fetch a huge amount of data from the database server (let’s say the database gets 100 different queries from a machine within 2 seconds), and there are about 10 or 11 machines doing that.

2) have to update certain tables after processing the data (about 3 or 4 update/insert queries per machine per 1.5 seconds).

What I have noticed is that the database sometimes goes down with a “server aborted process abnormally” error, or freezes the server machine (requiring a hard reset).

By the way, all machines maintain a constant connection to the database at all times, i.e. once a connection is made using Psycopg2 (in Python) it remains active until processing finishes (which could last hours).

What’s the best/optimal way of handling a large number of connections in the application? Should they be destroyed after each query?

Secondly, should I increase max_connections?

Would greatly appreciate any advice on this matter.

1 Answer

Editorial Team
Added an answer on May 13, 2026 at 12:33 am

    The most likely cause indeed sounds like running out of memory. If these are Linux servers, triggering an out-of-memory condition invokes the “OOM-killer”, which simply terminates the memory hog processes (hence “server aborted process abnormally”). A low-memory situation often means very high disk swapping/paging load, which makes the server seem unresponsive.

    Check your kernel log files (or the dmesg command) for anything resembling “Out of Memory: Killed process 1234 (postgres)”. This is caused by the kernel default that permits overcommitting memory. The first thing you should do is disable overcommit, to allow graceful handling of out-of-memory situations:

    echo 2 > /proc/sys/vm/overcommit_memory
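
To make the setting persist across reboots, and to check whether the OOM killer has in fact been active, something along these lines may help (a sketch for a typical Linux system; file locations can vary by distribution):

```shell
# Apply the setting immediately (equivalent to the echo above)
sysctl -w vm.overcommit_memory=2

# Persist it across reboots
echo "vm.overcommit_memory = 2" >> /etc/sysctl.conf

# Look for evidence of past OOM kills in the kernel ring buffer
dmesg | grep -i "out of memory"
```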
    

    Plan A:

    A likely culprit is the work_mem setting, which specifies how much memory each individual operation can allocate. One query may consist of multiple memory-intensive steps, so each backend can allocate several times the work_mem amount of memory, in addition to the global shared_buffers setting. On top of that, you also need some free memory for the operating system cache.

    For more info see the PostgreSQL manual on resource consumption settings: PostgreSQL 8.3 Documentation, Resource Consumption
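
As a rough sanity check, the worst-case memory demand implied by these settings can be estimated with simple arithmetic. The figures below are purely illustrative assumptions, not values taken from the question:

```python
# Back-of-the-envelope worst-case memory estimate for a PostgreSQL server.
# All configuration values here are illustrative assumptions.
MB = 1024 * 1024

shared_buffers = 256 * MB    # global buffer cache, allocated once
work_mem = 32 * MB           # per sort/hash step, per backend
steps_per_query = 3          # memory-hungry steps one query might run at once
max_connections = 100        # backends that could all be busy simultaneously

worst_case = shared_buffers + max_connections * steps_per_query * work_mem
print(f"Worst case: {worst_case // MB} MB")  # prints: Worst case: 9856 MB
```

Even with these modest settings, every backend running flat out could demand nearly 10 GB, which easily explains an out-of-memory condition on a smaller server.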

    Plan B:

    It might be that reducing these tunables slows your queries down so much that you will still get no work done. An alternative to this is artificially limiting the number of queries that can run in parallel. Many connection pooling middlewares for PostgreSQL can limit the number of parallel queries, and provide queueing instead. Examples of this software are pgbouncer (simpler) and pgpool-II (more flexible).
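
The same throttling idea can also be applied inside the application itself, without middleware. A stdlib-only sketch (the `run_query` function is a hypothetical stand-in for real database work):

```python
import threading

# Allow at most 5 queries to run against the database at once;
# the rest block and wait their turn, much as a pooler would queue them.
MAX_PARALLEL_QUERIES = 5
query_slots = threading.BoundedSemaphore(MAX_PARALLEL_QUERIES)

def run_query(sql):
    """Hypothetical stand-in for executing a query on a database connection."""
    with query_slots:  # blocks while 5 queries are already in flight
        return f"result of {sql!r}"

threads = [threading.Thread(target=run_query, args=(f"SELECT {i}",))
           for i in range(20)]
for t in threads:
    t.start()
for t in threads:
    t.join()
```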

    EDIT: Answering your questions:

    What’s the best / optimal way for handling large number of connections in the application, should they be destroyed after each query ?

    In general, establishing new connections to PostgreSQL is not fast, because PostgreSQL spawns a new process for each backend. At the same time, backend processes are not cheap in terms of memory, so keeping many idle connections to the database open is not a good idea either.

    The connection pooling middlewares I mentioned in Plan B will take care of keeping a reasonable number of connections to Postgres — regardless of when or how often you connect or disconnect from the pooler. So if you choose that route, you don’t need to worry about manually opening/closing connections.
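
If running a separate pooler is not an option, the borrow/return pattern can also be sketched directly in the application: open a fixed number of connections once and reuse them, instead of opening one per query. This is a stdlib-only illustration; in the real application the `connect` factory would be something like `lambda: psycopg2.connect(dsn)`:

```python
import queue

class ConnectionPool:
    """Tiny borrow/return pool: open `size` connections up front, reuse them."""

    def __init__(self, connect, size):
        self._pool = queue.Queue()
        for _ in range(size):
            self._pool.put(connect())

    def getconn(self):
        # Blocks until a connection is free instead of opening a new one.
        return self._pool.get()

    def putconn(self, conn):
        self._pool.put(conn)

# Demo with a dummy factory standing in for psycopg2.connect.
pool = ConnectionPool(connect=object, size=4)
conn = pool.getconn()
# ... run queries on conn ...
pool.putconn(conn)
```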

    Secondly should I increase max_connections ?

    Unless your database server has a large amount of RAM (over 8 GB), I would not go above the default limit of 100 connections.
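
Tying the tunables together, an illustrative postgresql.conf fragment for a modest server might look like the following. The values are assumptions to be sized against your actual RAM, not recommendations from this answer:

```
max_connections = 100
shared_buffers = 256MB
work_mem = 16MB
maintenance_work_mem = 64MB
```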


© 2021 The Archive Base. All Rights Reserved