Hey! I was (hardly) contributing this afternoon, but now I receive HTTP 509 when I use Potlatch 2. My request was:

http://www.openstreetmap.org/api/0.6/map?bbox=%2D3%2E9156027138233185%2C47%2E87156539324182%2C%2D3%2E913097530603409%2C47%2E87303352924003

and the response headers were:

(Status-Line)  HTTP/1.1 509 Bandwidth Limit Exceeded
Date  Sun, 30 Jan 2011 17:00:49 GMT
Error  You have downloaded too much data. Please try again later.
Vary  Accept-Encoding
Content-Encoding  gzip
Content-Length  133
Keep-Alive  timeout=15, max=100
Connection  Keep-Alive
Content-Type  text/html

Is that a new feature?

asked 30 Jan '11, 17:18 Marcussacapu...
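As a side note, the bbox in that URL is just percent-encoded coordinates. A quick sketch of decoding it (Python, standard library only):

```python
from urllib.parse import unquote

# Percent-encoded bbox string copied from the request URL above.
encoded = ("%2D3%2E9156027138233185%2C47%2E87156539324182"
           "%2C%2D3%2E913097530603409%2C47%2E87303352924003")

# %2D is '-', %2E is '.', %2C is ','; after decoding, the map call's
# bbox order is left,bottom,right,top (min lon, min lat, max lon, max lat).
left, bottom, right, top = map(float, unquote(encoded).split(","))
print(left, bottom, right, top)
```

Decoded, this is a tiny box (a fraction of a thousandth of a degree on each side), so the request itself is not oversized; the 509 comes from cumulative traffic, not from this single call.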
The bandwidth limitation depends on how much data is downloaded from the same IP address in a relatively short timespan. It is difficult, but not impossible, to trigger this with P2. The most likely cause would be that you were working at a relatively low zoom level (with lots of data visible) and panning around a lot. It is also possible that you were using a proxy server or masqueraded internet access, so that others seemed to be using the API at the same time as you. Bandwidth limitation is a relatively new feature and will certainly still have to be fine-tuned to avoid "false positives" like this; sadly, such a limitation is required because a number of people disregard the API Usage Policy and download excessive amounts of data, slowing down the API for the rest of us. answered 30 Jan '11, 18:32 Frederik Ramm ♦
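For client authors, the polite reaction to a 509 is to back off rather than retry immediately. A minimal sketch of that idea (Python; `fetch` here is a hypothetical callable standing in for the real HTTP request, not part of any OSM library):

```python
import time

def fetch_with_backoff(fetch, url, max_retries=4, base_delay=1.0):
    """Call fetch(url), retrying with exponential backoff while the
    server answers 509 Bandwidth Limit Exceeded. `fetch` is any
    callable returning an object with a .status attribute."""
    for attempt in range(max_retries):
        response = fetch(url)
        if response.status != 509:
            return response
        # Wait base_delay, 2*base_delay, 4*base_delay, ... seconds.
        time.sleep(base_delay * (2 ** attempt))
    raise RuntimeError(f"still rate-limited after {max_retries} attempts")
```

Exponential backoff spreads retries out over time, which is exactly what a per-IP bandwidth limiter wants clients to do.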
Thank you both for your answers. I suggest this HTTP header should produce a nicer message than "can't load the map"; I propose something like: "Go outside and do a survey! You've stayed behind your computer too long."
(30 Jan '11, 21:21)
Marcussacapu...
The bandwidth limit has been in place for many months now. It is only triggered for a very small number of users though - basically you have to download a large amount of data (hundreds of megabytes) in a short space of time. answered 30 Jan '11, 18:30 TomH ♦♦
This limitation frustrates me too. I need to download large amounts of data because I am importing data from Corine Land Cover 2006, which covers large areas. The limiter should adjust the limit according to uploads :-( Please whitelist me - username: *Martin*
(08 Feb '11, 20:53)
Martin
Patches welcome. As things stand we have no capability to whitelist anybody, and certainly not by username given that there is no requirement to be logged in to download data.
(08 Feb '11, 21:00)
TomH ♦♦
I can work around this limitation by setting up an HTTP proxy on a remote server, but I don't want to do that. As the limitation is based not on the username but on the IP address, please whitelist mine: 88.212.34.59. Thanks.
(10 Feb '11, 17:49)
Martin
We do not have any ability to whitelist anything - there is literally no code to do that. Deliberately attempting to evade limits is quite likely to result in additional measures being taken. The limits we put in place are not done for fun - they are to try and ensure that everybody gets a fair go and reasonable performance.
(10 Feb '11, 18:17)
TomH ♦♦
I would also add that this is not the correct place to debate policy. A question was asked here and it has been answered - if you have a problem with the policy please take it to a mailing list or address it directly with the technical working group.
(10 Feb '11, 18:18)
TomH ♦♦
Make sure you are zoomed in a long way before starting to use Potlatch 2. answered 30 Jan '11, 19:30 Richard ♦
If you're working with this much data, rather than using Potlatch (which tends to play fast and loose with requests in an effort to make quick or minor edits easy), it may be better to use JOSM or Merkaartor. Both download data from the API only when explicitly told to, and bring many more features to the table that make them suitable for heavy-duty editing. answered 10 Feb '11, 19:06 Baloo Uriza
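Whichever editor you use, a quick sanity check before a download is the size of the bounding box you are requesting (the 0.6 API also enforces its own maximum request area, a quarter of a square degree at the time of writing; treat that figure as then-current policy, not a guarantee). A rough sketch of such a check:

```python
def bbox_area_deg2(left, bottom, right, top):
    """Crude bounding-box size in square degrees (no projection;
    a degree of longitude is narrower away from the equator)."""
    return (right - left) * (top - bottom)

# The bbox from the original question is far below any plausible limit.
area = bbox_area_deg2(-3.9156027138233185, 47.87156539324182,
                      -3.913097530603409, 47.87303352924003)
print(f"{area:.2e} square degrees")
```

A result well under the limit confirms that individual requests like the one in the question are small; it is the number of requests over time, not their size, that trips the bandwidth limiter.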
Don't forget to accept an answer (the round checkmark button).
When will we be able to download data again once we have triggered this limit? I am editing pipelines in Russia with Go Map and it downloads data for all surrounding features. As the pipelines extend for thousands of kilometres, I seem to have exceeded the limit. Thanks!