I have a program which receives data from unmanaged code about 500 to 700 times per second. Some of this data is useful and needs to be processed, and some of it is useless and gets discarded right away. To find out whether received data is useful, I use a List of strings.
My problem/question is: when I use a lock on the List to delete some or all entries, will I get a big pileup of threads waiting to search the List?
Because deleting the entire list or parts of it doesn't happen continuously, I currently use a static Boolean. When I start deleting, I set the Boolean to false, and all incoming data is discarded before the list gets searched. When I'm done, I set the Boolean back to true.
Is this a good workaround, or is there a better one?
(I'm also asking because testing is very time-consuming at this point.)
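For reference, the flag approach described above might look like the sketch below. The class and member names are illustrative, and the volatile modifier is my assumption — the question doesn't say how the field is declared, but without volatile (or a lock) the receiving threads are not guaranteed to see the change promptly:

```csharp
// Sketch of the static-Boolean workaround described above.
// Assumption: the field is marked volatile so receiving threads
// observe updates promptly; a plain static bool read may be stale.
class ReceiveGate
{
    private static volatile bool _accepting = true;

    // Main thread: call before starting to delete list entries.
    public static void BeginDelete() { _accepting = false; }

    // Main thread: call when deletion is finished.
    public static void EndDelete() { _accepting = true; }

    // Receiving threads: discard incoming data while this is false.
    public static bool ShouldProcess() { return _accepting; }
}
```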
EDIT
The program is used to check whether the strings in the list are correct. The unmanaged code sends data, and this happens on a new thread. If the data is useful, it gets displayed and the user can verify it. If the data turns out to be displayed but not useful, the user can delete the string from the List, which happens on the main thread.
Yes – you may well get a “big pile up of threads”.
I would recommend looking into a lock with reader/writer semantics as opposed to a single “brutal” exclusive lock. This should enable many readers to read your data concurrently. Only when a writer comes along to update the data will an exclusive lock be taken. Provided the number of writes is low in relation to the number of reads, you’ll have very few threads “backing up”.
The .NET ReaderWriterLockSlim is one possibility, but I heartily recommend you look at the OneManyResourceLock from Jeffrey Richter's Power Threading Library.
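A rough sketch of guarding the list with ReaderWriterLockSlim — the class and method names here are illustrative, not from the question:

```csharp
using System;
using System.Collections.Generic;
using System.Threading;

// Many concurrent readers can search the list; a writer
// (the user deleting entries) takes the exclusive lock briefly.
class UsefulDataFilter
{
    private readonly List<string> _entries = new List<string>();
    private readonly ReaderWriterLockSlim _lock = new ReaderWriterLockSlim();

    // Called ~500-700 times per second from the receiving threads;
    // multiple readers can hold the read lock at the same time.
    public bool IsUseful(string data)
    {
        _lock.EnterReadLock();
        try { return _entries.Contains(data); }
        finally { _lock.ExitReadLock(); }
    }

    // Rare writes (user adds or deletes a string) block readers
    // only for the duration of the list mutation.
    public void AddEntry(string entry)
    {
        _lock.EnterWriteLock();
        try { _entries.Add(entry); }
        finally { _lock.ExitWriteLock(); }
    }

    public void RemoveEntry(string entry)
    {
        _lock.EnterWriteLock();
        try { _entries.Remove(entry); }
        finally { _lock.ExitWriteLock(); }
    }
}
```

Provided writes stay rare relative to reads, readers rarely block each other, which is the point of the reader/writer split.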