I have experimented with a technique that makes a web page request and calculates the bit rate as bytes received divided by the time elapsed. You can average multiple data points, of course, but is this as accurate a bit-rate estimate as can be made?
Do any professional, less hackish techniques exist? (Or is this just one of the magical mysteries of the internet?)
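Here is roughly what I have been doing, as a minimal sketch (the URL is a placeholder, and the chunk size is just a value I picked):

```python
import time
import urllib.request

def measure_bps(url, chunk_size=65536):
    """Download a URL and return (byte_count, seconds, bits_per_second)."""
    start = time.perf_counter()
    total = 0
    with urllib.request.urlopen(url) as resp:
        while True:
            chunk = resp.read(chunk_size)
            if not chunk:
                break
            total += len(chunk)
    elapsed = time.perf_counter() - start
    return total, elapsed, total * 8 / elapsed

def average_bps(samples):
    """samples: list of (byte_count, seconds) pairs; return mean bits/sec."""
    rates = [nbytes * 8 / secs for nbytes, secs in samples]
    return sum(rates) / len(rates)
```

I then call `measure_bps` several times and feed the (bytes, seconds) pairs into `average_bps` to smooth out the noise.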
This really depends on what kind of speed information you are interested in; there are several things to be aware of:
My suggestion is to make requests to a fair number of distinct sites [within an acceptable margin of error].
Test lots of them and take the average speed; that is what a client can generally expect. The maximum observed is an estimate of the best speed the client can get. The only minimum a client is guaranteed is 0 Mbps.
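The summary above can be sketched as a small aggregation step, assuming you already have one throughput sample (in bits per second) per site:

```python
from statistics import mean

def summarize(rates_bps):
    """rates_bps: one throughput sample (bits/sec) per tested site.

    Returns the figures described above: the average is what a client
    can generally expect, the max is an estimated peak, and the only
    guaranteed minimum is zero.
    """
    return {
        "avg_bps": mean(rates_bps),  # typical expectation
        "max_bps": max(rates_bps),   # estimated best case
        "min_bps": 0.0,              # the only hard guarantee
    }
```

More sites means the average is less skewed by any one slow or nearby server, at the cost of a longer test.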