I have some very basic semaphore code that works great on Linux, but I cannot for the life of me get it to run properly on OS X… It returns the oddest of results…
#include <iostream>
#include <fcntl.h>
#include <stdio.h>
#include <semaphore.h>
int main()
{
    sem_t* test;
    test = sem_open("test", O_CREAT, 0, 1);
    int value;
    sem_getvalue(test, &value);
    printf("Semaphore initialized to %d\n", value);
}
Compiling this on OS X with g++ returns the following output:
iQudsi:Desktop mqudsi$ g++ test.cpp
iQudsi:Desktop mqudsi$ ./a.out
Semaphore initialized to -1881139893
Whereas on Ubuntu, I get the decidedly more-sane result:
iQudsi:Desktop mqudsi$ g++ test.cpp -lrt
iQudsi:Desktop mqudsi$ ./a.out
Semaphore initialized to 1
I’ve been at this for 3 hours straight, and cannot figure out why OS X is returning such bizarre results…
I’ve tried using file paths as the semaphore name, it didn’t make a difference.
I’d appreciate any help I could get.
You are using a function that is not currently implemented in Mac OS X: sem_getvalue fails there, leaving your output variable untouched, so the integer you are printing is just whatever data happened to be sitting in that memory. Had you zero-initialised it with int value = 0; you might have caught this mistake sooner. This is the code I used (thanks to bdonlan):
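A minimal sketch of an error-checked version in that spirit (the 0644 mode and the perror reporting are illustrative assumptions, not necessarily the exact snippet referenced): it tests sem_open against SEM_FAILED and checks the return value of sem_getvalue, so an unimplemented call is reported instead of printing uninitialised memory.

#include <cstdio>
#include <fcntl.h>
#include <semaphore.h>

int main()
{
    // Create (or open) the named semaphore with an initial count of 1.
    // 0644 is an illustrative permission mode; the original question passed 0.
    sem_t* test = sem_open("test", O_CREAT, 0644, 1);
    if (test == SEM_FAILED) {
        perror("sem_open");
        return 1;
    }

    // Zero-initialise so stale stack data cannot masquerade as a result.
    int value = 0;
    if (sem_getvalue(test, &value) == -1) {
        // On OS X this branch is taken and perror reports the failure,
        // because sem_getvalue is not implemented there.
        perror("sem_getvalue");
        return 1;
    }

    printf("Semaphore initialized to %d\n", value);
    sem_close(test);
    return 0;
}

On Linux this should still print "Semaphore initialized to 1"; on OS X it should report the sem_getvalue error instead of a garbage value.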