
The Archive Base

Editorial Team
Asked: May 13, 2026

I have a queue in which I can enqueue different threads, so I can


I have a queue in which I can enqueue different threads, so that I can assure two things:

  1. Requests are processed one by one.
  2. Requests are processed in their order of arrival.

The second point is the important one; otherwise a simple critical section would be enough.
I have different groups of requests, and these guarantees must hold only within a single group. Requests from different groups can run concurrently.

It looks like this:

FTaskQueue.Enqueue('MyGroup');
try
  // do something (running in the context of some thread)
finally
  FTaskQueue.Dequeue('MyGroup');
end;

EDIT: I have removed the actual implementation because it hides the problem I want to solve

I need this because I have an Indy-based web server that accepts HTTP requests. First I find the corresponding session for the request; then the request (code) is executed for that session. I can get multiple requests for the same session (that is, new requests can arrive while the first is still processing), and they must execute one by one, in the correct order of arrival. So I am looking for a generic synchronization queue that can be used in such situations to queue the requests. I have no control over the threads, and each request may be executed in a different thread.

What is the best (usual) approach to this sort of problem? The problem is that Enqueue and Dequeue must be atomic operations so that the correct order is preserved. My current implementation has a substantial bottleneck, but it works.

EDIT: Below is the problem with atomic Enqueue / Dequeue operations.

You would normally do something like this:

procedure Enqueue;
begin
  EnterCriticalSection(FCritSec);
  try
    DoEnqueue;
  finally 
    LeaveCriticalSection(FCritSec);
  end;

  BlockTheCurrentThread; // here the thread blocks itself
end;

procedure Dequeue;
begin
  EnterCriticalSection(FCritSec);
  try
    DoDequeue;
    UnblockTheNextThread; // here the thread unblocks another thread
  finally 
    LeaveCriticalSection(FCritSec);
  end;
end;

Now the problem here is that this is not atomic. If one thread is already in the queue and a second one comes along and calls Enqueue, it can happen that the second thread leaves the critical section and is about to block itself when the scheduler resumes the first thread, which then tries to unblock the next (second) thread. But the second thread is not blocked yet, so nothing happens. The second thread then continues and blocks itself, which is wrong, because now it will never be unblocked. And if the blocking is done inside the critical section instead, the critical section is never left and we have a deadlock.
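The usual way out of this lost-wakeup race is to block on a primitive that remembers its signal, such as a manual-reset event: the event is registered inside the critical section, and a "set" that arrives before the "wait" simply makes the wait return immediately instead of being lost. A minimal Python illustration of this latching behavior (`threading.Event` behaves like a manual-reset event):

```python
import threading

ev = threading.Event()

# The "unblock" can happen before the "block" without being lost:
ev.set()                       # another thread signals first...
assert ev.wait(timeout=1.0)    # ...and the late waiter still returns immediately

# Contrast with suspend/resume-style blocking, where a resume delivered
# before the matching suspend is simply dropped and the thread sleeps forever.
```

This is exactly the property that makes "register under the lock, wait outside the lock" safe.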


1 Answer

Editorial Team
Answered on May 13, 2026 at 7:34 am

    Another approach:

    Let each request thread have a manual reset event that is initially unset. The queue manager is a simple object which maintains a thread-safe list of such events. The Enqueue() and Dequeue() methods both take the event of the request thread as a parameter.

    uses
      SysUtils, Classes, SyncObjs, Generics.Collections; // units the code below relies on

    type
      TRequestManager = class(TObject)
      strict private
        fCritSect: TCriticalSection;
        fEvents: TList<TEvent>;
      public
        constructor Create;
        destructor Destroy; override;
    
        procedure Enqueue(ARequestEvent: TEvent);
        procedure Dequeue(ARequestEvent: TEvent);
      end;
    
    { TRequestManager }
    
    constructor TRequestManager.Create;
    begin
      inherited Create;
      fCritSect := TCriticalSection.Create;
      fEvents := TList<TEvent>.Create;
    end;
    
    destructor TRequestManager.Destroy;
    begin
      Assert((fEvents = nil) or (fEvents.Count = 0));
      FreeAndNil(fEvents);
      FreeAndNil(fCritSect);
      inherited;
    end;
    
    procedure TRequestManager.Dequeue(ARequestEvent: TEvent);
    begin
      fCritSect.Enter;
      try
        Assert(fEvents.Count > 0);
        Assert(fEvents[0] = ARequestEvent);
        fEvents.Delete(0);
        if fEvents.Count > 0 then
          fEvents[0].SetEvent;
      finally
        fCritSect.Release;
      end;
    end;
    
    procedure TRequestManager.Enqueue(ARequestEvent: TEvent);
    begin
      fCritSect.Enter;
      try
        Assert(ARequestEvent <> nil);
        if fEvents.Count = 0 then
          ARequestEvent.SetEvent
        else
          ARequestEvent.ResetEvent;
        fEvents.Add(ARequestEvent);
      finally
        fCritSect.Release;
      end;
    end;
    

    Each request thread calls Enqueue() on the queue manager and afterwards waits for its own event to become signalled. Then it processes the request and calls Dequeue():

    { TRequestThread }
    
    type
      TRequestThread = class(TThread)
      strict private
        fEvent: TEvent;
        fManager: TRequestManager;
      protected
        procedure Execute; override;
      public
        constructor Create(AManager: TRequestManager);
      end;
    
    constructor TRequestThread.Create(AManager: TRequestManager);
    begin
      Assert(AManager <> nil);
      inherited Create(TRUE);
      fEvent := TEvent.Create(nil, TRUE, FALSE, '');
      fManager := AManager;
      Resume;
    end;
    
    procedure TRequestThread.Execute;
    begin
      fManager.Enqueue(fEvent);
      try
        fEvent.WaitFor(INFINITE);
        OutputDebugString('Processing request');
        Sleep(1000);
        OutputDebugString('Request processed');
      finally
        fManager.Dequeue(fEvent);
      end;
    end;
    
    { TForm1 }
    
    procedure TForm1.Button1Click(Sender: TObject);
    var
      i: integer;
    begin
      for i := 1 to 10 do
        TRequestThread.Create(fRequestManager);
    end;
    

    The queue manager locks the list of events both in Enqueue() and in Dequeue(). If the list is empty in Enqueue() it sets the event in the parameter, otherwise it resets the event. Then it appends the event to the list. Thus the first thread can continue with the request, all others will block. In Dequeue() the event is removed from the top of the list, and the next event is set (if there is any).

    That way each finishing request thread causes the next request thread to unblock, completely without suspending or resuming threads. The solution also needs no additional threads or windows; a single event object per request thread is all that is required.



© 2021 The Archive Base. All Rights Reserved
