
Osmarender - backlog?

Posted by Donald Allwright on 16 May 2008 in English

I've noticed this week that changes don't seem to be appearing in the osmarender layer. Looking at the server status pages it looks like there is a large backlog (30,000 requests). There are also some changes I made a few weeks ago which still aren't rendered at all at lower resolutions. Is it the case that we have reached the point where there are simply more people editing openstreetmap.org, but without a corresponding increase in the number of people willing to render tiles? If so, what can be done about it? Do we just need more people to volunteer their CPU time to keep up?

I noticed that Deelkar expressed some concerns about osmarender a few weeks ago, especially concerning the hardware requirements for rendering the more complicated tiles - these can take up to 12GB of RAM to render, which is more than most people have available. The osmarender infrastructure seems to be creaking at the seams; I'm not one of the developers, but I wonder what (if anything) I can do to help. I find osmarender really useful as a preliminary check of my edits: it usually flags up a few issues that need tweaking before Mapnik renders those changes on a Wednesday.


Discussion

Comment from davidearl on 16 May 2008 at 20:30

If you go to www.informationfreeway.org, zoom to level 12, and CTRL+CLICK on the tiles in the area you have changed, you submit a higher-priority manual render request for that area. My changes yesterday, covering 6 tiles, rendered within a couple of hours.

Comment from Malcolm Boura on 17 May 2008 at 12:09

12GB of RAM to render implies a flawed implementation, and the problem will almost certainly get worse as the coverage and detail increase. Unfortunately I don't have the time available to offer to take a look at it.

Comment from HannesHH on 17 May 2008 at 13:04

I would happily let my PC render tiles if it were easy to set up (without needing to install so many things). The VirtualBox machine would be helpful (and, of course, being able to use the same login): osm.wiki/index.php/Virtual_Tiles%40Home

Comment from chasu on 17 May 2008 at 13:26

Please have a look here: http://munin.openstreetmap.org/openstreetmap/tah.openstreetmap-tah_queue.html - it seems that the bottleneck is not the CPU power of the clients but the network bandwidth of the server. Therefore more CPUs won't help with that problem unless the server gets higher bandwidth.

Let me know if I'm wrong here.

Comment from HannesHH on 17 May 2008 at 13:41

Oh, actually it seems installation was a breeze (on Debian). :)

Comment from tyll on 17 May 2008 at 15:03

Afaik the bottleneck is at the server, but it is not the bandwidth. The problem seems to be that the server takes too long to process uploaded jobs. At http://tah.openstreetmap.org/Upload/go_nogo.php you can see the queue length, which is the number of uploads that have already been transferred but not yet processed by the server.
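
If you want to keep an eye on that number without reloading the page by hand, here is a quick Python sketch (purely my own illustration, not part of t@h; the page's exact output format is an assumption, so it only prints the raw response):

    # Fetch the go_nogo page mentioned above and print whatever it returns.
    # No parsing is attempted, since the exact output format is an assumption.
    import urllib.request

    URL = "http://tah.openstreetmap.org/Upload/go_nogo.php"

    def check_queue():
        with urllib.request.urlopen(URL, timeout=30) as resp:
            body = resp.read().decode("utf-8", errors="replace")
        print(body.strip())

    if __name__ == "__main__":
        check_queue()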

Comment from Donald Allwright on 17 May 2008 at 17:43

I have downloaded and installed the necessary packages and it seems fairly straightforward, although as it's on a PC that isn't well suited to the workload I haven't yet requested an upload password. Also, if, as chasu suggests, client CPU isn't the real problem, then it won't make the slightest bit of difference. I had a look at osmarender last night and rendered some experimental changes locally; this is probably the best way of achieving what I originally wanted (i.e. making sure that changes render OK). However, the bottleneck remains. If it's at the server, there is probably little that I personally can do: fixing it would require software optimisations on the server, more expensive hardware, or a re-architecture to allow the load to be shared among multiple servers.
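
In case it helps anyone else trying the same thing: rendering locally is essentially an XSLT transform from OSM XML to SVG, which xsltproc can run. A rough Python sketch of that step (the file names are placeholders and xsltproc is assumed to be installed; the real osmarender checkout dictates the actual stylesheet and rules-file names):

    # Rough local-render sketch: osmarender is an XSLT transform that turns
    # OSM XML into SVG. The file names below are placeholders; adjust them
    # to match your own osmarender checkout and data extract.
    import subprocess

    def osm_to_svg(stylesheet="osmarender.xsl",
                   rules="osm-map-features-z17.xml",
                   out_svg="map.svg"):
        # xsltproc applies <stylesheet> to <rules> and writes the result to -o.
        subprocess.run(["xsltproc", "-o", out_svg, stylesheet, rules],
                       check=True)
        return out_svg

    if __name__ == "__main__":
        print("Wrote", osm_to_svg())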

Comment from andrewpmk on 17 May 2008 at 22:01

My view - start rebuilding the Mapnik database more often (ideally daily). Then there will be little need for Tiles@home, which is clearly terribly inefficient. Mapnik is much more efficient - after all it all runs on one server. Of course, if someone can figure out how to greatly reduce T@H's CPU and RAM consumption, go ahead...

Comment from Deelkar on 18 May 2008 at 22:50

The 12 GB figure *is* for a tile that is basically completely mapped out; however, the number of such tiles will grow in the future.
The basic bottleneck of t@h, next to the server, is the transformation from SVG to PNG. There are simply no lightweight but sufficiently feature-complete working SVG rasterizers.
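
Just to make that step concrete, a minimal Python sketch of the SVG-to-PNG stage, shelling out to rsvg-convert (only one possible rasterizer, not necessarily what t@h ships; for a heavily mapped tile this is the stage that eats the RAM):

    # Minimal sketch of the SVG -> PNG step, using rsvg-convert as an
    # example rasterizer. File names and tile width are illustrative.
    import subprocess

    def rasterize(svg_path="map.svg", png_path="map.png", width=256):
        # rsvg-convert renders the SVG at the requested pixel width to -o.
        subprocess.run(
            ["rsvg-convert", "-w", str(width), "-o", png_path, svg_path],
            check=True,
        )
        return png_path

    if __name__ == "__main__":
        print("Wrote", rasterize())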
