To begin with, most browsers open multiple download connections to the same host (IE uses 2, Firefox uses 4). This is not a problem in itself, but it's good to know. There is also TCP start/stop overhead, whose impact can be minimized by using large files and enabling keepalive. The biggest problem, however, is the browser's caching, which can trick the detection logic into thinking it has a superfast network connection. The same problem can also confuse multiple browsers behind a caching proxy server.
The solutions to all of these problems are relatively simple. First, download multiple files in parallel to make use of all the browser's connections to the server. Enable keepalives on the server to minimize TCP restart overhead. Use relatively large files for sampling, and finally append a random number as a URL parameter ("?randomnumber") to force the browser and any proxy to discard cached copies of the file.
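The steps above can be sketched in browser JavaScript. This is a minimal illustration, not the tester's actual source; the file path `/speedtest/sample.bin` is an assumption (any reasonably large file of known location on the server being measured would do):

```javascript
// Hypothetical test file hosted on the server being measured.
const SAMPLE_URL = "/speedtest/sample.bin";

// Append a random query string so the browser (and any proxy in
// between) treats every request as a fresh, uncacheable URL.
function cacheBustUrl(url) {
  return url + "?" + Date.now() + Math.random().toString(36).slice(2);
}

// One sample: time a full download and convert to bits per second.
async function sampleBandwidth() {
  const start = Date.now();
  const resp = await fetch(cacheBustUrl(SAMPLE_URL), { cache: "no-store" });
  const bytes = (await resp.arrayBuffer()).byteLength;
  const seconds = (Date.now() - start) / 1000;
  return (bytes * 8) / seconds; // bits per second
}

// Fire several samples in parallel to exercise all of the browser's
// connections, then average to smooth out TCP slow-start effects.
async function measureBandwidth(n = 4) {
  const results = await Promise.all(
    Array.from({ length: n }, () => sampleBandwidth())
  );
  return results.reduce((a, b) => a + b, 0) / n;
}
```

Running the samples in parallel rather than sequentially matters: a single connection understates what the browser actually achieves when it opens its usual two to four connections per host.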
Here is the version of the Bandwidth Tester I implemented for huntip.
Feel free to download, modify, and use the source from here. I would appreciate it if you link back or let me know about any enhancements you make.