I’ve got a text file with about 200,000 lines. Each line represents an object with multiple properties. I only search through one of the properties (the unique ID) of the objects. If the unique ID I’m looking for is the same as the current object’s unique ID, I’m going to read the rest of the object’s values.
Right now, each time I search for an object, I just read the whole text file line by line, create an object for each line and see if it’s the object I’m looking for – which is basically the most inefficient way to do the search. I would like to read all those objects into memory, so I can later search through them more efficiently.
The question is, what’s the most efficient way to perform such a search? Is a 200,000-entry NSArray a good way to do this (I doubt it)? How about an NSSet? With an NSSet, is it possible to search on only one property of the objects?
Thanks for any help!
— Ry
@yngvedh is correct in that an NSDictionary has O(1) lookup time (as is expected for a map structure). However, after doing some testing, you can see that NSSet also has O(1) lookup time. Here’s the basic test I did to come up with that: http://pastie.org/933070

Basically, I create 1,000,000 strings, then time how long it takes me to retrieve 100,000 random ones from both the dictionary and the set. When I run this a few times, the set actually appears to be faster…
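A rough reconstruction of that kind of timing test (not the exact pastie code — the keys, counts, and timing method here are my own) might look like this:

    #import <Foundation/Foundation.h>

    int main(void) {
        @autoreleasepool {
            NSUInteger count = 1000000;
            NSMutableDictionary *dict = [NSMutableDictionary dictionaryWithCapacity:count];
            NSMutableSet *set = [NSMutableSet setWithCapacity:count];

            // Populate both collections with the same 1,000,000 strings.
            for (NSUInteger i = 0; i < count; i++) {
                NSString *s = [NSString stringWithFormat:@"%lu", (unsigned long)i];
                dict[s] = s;
                [set addObject:s];
            }

            // Time 100,000 random lookups against the dictionary...
            NSDate *start = [NSDate date];
            for (NSUInteger i = 0; i < 100000; i++) {
                NSString *key = [NSString stringWithFormat:@"%u",
                                 arc4random_uniform((uint32_t)count)];
                (void)dict[key];
            }
            NSLog(@"dictionary: %f s", -[start timeIntervalSinceNow]);

            // ...and the same number against the set.
            start = [NSDate date];
            for (NSUInteger i = 0; i < 100000; i++) {
                NSString *key = [NSString stringWithFormat:@"%u",
                                 arc4random_uniform((uint32_t)count)];
                (void)[set member:key];
            }
            NSLog(@"set: %f s", -[start timeIntervalSinceNow]);
        }
        return 0;
    }

Note that a fair chunk of the loop time goes into building the key strings, so this measures relative rather than absolute lookup cost — but both collections hash their members, which is where the O(1) behavior comes from.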
In your particular case, I’m not sure either of these will be what you want. You say that you want all of these objects in memory, but do you really need them all, or do you just need a few of them? If it’s the latter, then I would probably read through the file once and create an object-ID-to-file-offset mapping (i.e., remember where each object ID is in the file). Then you could look up which ones you want and use the file offset to jump to the right spot in the file, parse that line, and move on. This is a job for NSFileHandle.
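A minimal sketch of that offset-index approach. I’m assuming here that each line is UTF-8, newline-terminated, and starts with the unique ID followed by a tab, and that no line exceeds 4 KB — adjust the parsing and read size to your actual format:

    #import <Foundation/Foundation.h>

    // Pass 1: scan the file once and map each object ID to its byte offset.
    NSDictionary *BuildOffsetIndex(NSString *path) {
        NSData *data = [NSData dataWithContentsOfFile:path];
        NSString *contents = [[NSString alloc] initWithData:data
                                                   encoding:NSUTF8StringEncoding];
        NSMutableDictionary *index = [NSMutableDictionary dictionary];
        unsigned long long offset = 0;
        for (NSString *line in [contents componentsSeparatedByString:@"\n"]) {
            NSString *objectID = [[line componentsSeparatedByString:@"\t"] firstObject];
            if (objectID.length > 0) {
                index[objectID] = @(offset);
            }
            // Advance by the line's byte length plus the newline we split on.
            offset += [line lengthOfBytesUsingEncoding:NSUTF8StringEncoding] + 1;
        }
        return index;
    }

    // Lookup: seek straight to the recorded offset and read just that line.
    NSString *LineForID(NSString *path, NSDictionary *index, NSString *objectID) {
        NSNumber *offset = index[objectID];
        if (!offset) return nil;
        NSFileHandle *handle = [NSFileHandle fileHandleForReadingAtPath:path];
        [handle seekToFileOffset:offset.unsignedLongLongValue];
        NSData *chunk = [handle readDataOfLength:4096];
        NSString *text = [[NSString alloc] initWithData:chunk
                                               encoding:NSUTF8StringEncoding];
        return [[text componentsSeparatedByString:@"\n"] firstObject];
    }

The index costs one NSNumber per line instead of one full object per line, so memory stays modest, and each lookup after the initial scan is a hash probe plus a single seek-and-read.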