b-jazz's Comments
Post | Comment |
---|---|
Workflow for adding surface information collected during bike rides | I went on a road trip recently and had this exact same problem/project come to mind. Thanks for the excellent write-up. I have a few comments:
Adding Microsoft Building Footprints To OSM With MapRoulette: Why And How | I see a lot of building errors when going through and cleaning up things that OSM Inspector points out. I’m 95% positive these are from the Microsoft data, but I haven’t been able to conclusively prove it. I also haven’t been able to track anyone down that might be able to correct the area and update the shapefile dump so these errors don’t continue to show up. |
OpenStreetMap is currently free from duplicate nodes | I do these when I find them as well. I also clean up where two different node IDs share the same exact lat/long pair. I have some scripts set up that will download the data from OSM Inspector and then parse it into small areas so that I can work on several duplicates in a single changeset. Thanks for helping with the cause. |
Analysis of Bounding Box Sizes Over the Last Eight Years | @Jennings: oh man, I wish I knew about that Amazon/Athena store before I started this. That might have been a big help. I’ll have to store that one away for any future plans. Thanks! What I’d really like to do is figure out the “empty space” of the bounding box if you consider the bounding boxes of the individual changes in the set. But I’d need to dig further and find the bboxes of the individual objects, which aren’t available in my data. If I were to have my ideal feature in an editor that would warn me that my bounding box is too large, I’d really want it to warn me when the empty space is too large. If I edit something massive like the Bermuda Triangle, the overlapping bbox of its three ways would pretty much fill the bbox of the changeset. At least that should be one factor to consider when deciding whether or not to “hassle” the user with a warning. As for bots, do you have a dataset of bot usernames/userids? |
Analysis of Bounding Box Sizes Over the Last Eight Years | @PierZen: thanks for the comments and your tweet with another view of changeset analysis. Neat stuff. |
Analysis of Bounding Box Sizes Over the Last Eight Years | @imagico: I added a third heatmap with the perimeter length of the bounding box. There are fewer buckets (keeping with the earlier use of doubling the bucket size on every iteration), so the “heat” looks a little more condensed, but it turns up some slight variations in the pattern. It’s an interesting take. Thanks for the suggestion. |
Analysis of Bounding Box Sizes Over the Last Eight Years | @imagico: Yes, yes. I see what you’re saying. Thanks for the comments! I’ll look into doing another heatmap with this idea. |
Analysis of Bounding Box Sizes Over the Last Eight Years | @tyr_asd I think I did start with that data but learned that it either didn’t include the bounding box or didn’t organize the changes into changesets. There was some reason I didn’t use it, but it’s not worth another 3.5GB download to figure out why. :) |
Analysis of Bounding Box Sizes Over the Last Eight Years | @imagico To calculate area, I use PostGIS’s ST_Area() function. Over the past year, my sampling (roughly 1/13th of all changesets) has 829 records that are over 2^40 square meters. I’m not a statistician and can’t speak to how representative sampling is when it comes to rare events (0.06% in this case), but I can see it being inaccurate in either direction. |
Analysis of Bounding Box Sizes Over the Last Eight Years | Looking at my (sampled) data, there were 760 changesets with only two objects modified that ended up with a changeset bounding box greater than 1,000 square kilometers. The sampling factor is roughly 1:13, so that extrapolates to about 10,000 very large changesets in a year from just two changes each (likely two nodes, judging from a few that I looked at by hand). |
Improving the Behavior of Search Engine Optimizer (SEO) Companies | Thanks @aharvey. For others, the video can be found at https://www.youtube.com/watch?v=BovbAIIJ6L8 For us, the hardest part was trying to block a moving target, since a new account was created for every single POI they added. It was a lot of effort to chase down the source of the edits, but I think it was worth it in the end. I’d suggest starting there. And if/when you do find the people responsible, treat them with respect, understand where they are coming from, and try to sell them on a win/win solution. |
HTTPS All The Things (https_all_the_things) | I’m refactoring the code as we speak to apply to a broader set of tags. I expect I’ll start a run of those in the next week or two. And yes, you’re right that the whole planet has been completely processed looking for the website key. I’m just rolling across the entire planet about once a week looking for new additions. |
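The “empty space” idea floated in the bounding-box comments above can be sketched in code: take the changeset’s bbox area, subtract the area covered by the union of the individual objects’ bboxes, and warn when the leftover fraction is large. This is a minimal, hypothetical sketch (the function names and the coordinate-compression grid are mine, not from the author’s scripts), using plain planar rectangles rather than real projected OSM coordinates:

```python
# Hypothetical sketch of the "empty space" metric: how much of a changeset's
# bbox is NOT covered by the bboxes of the individual objects it touched.
# Rectangles are (min_x, min_y, max_x, max_y) in arbitrary planar units.

def union_area(rects):
    """Total area covered by a set of axis-aligned rectangles
    (coordinate-compression grid; fine for small rectangle counts)."""
    if not rects:
        return 0.0
    xs = sorted({x for r in rects for x in (r[0], r[2])})
    ys = sorted({y for r in rects for y in (r[1], r[3])})
    total = 0.0
    for i in range(len(xs) - 1):
        for j in range(len(ys) - 1):
            cx = (xs[i] + xs[i + 1]) / 2  # cell center
            cy = (ys[j] + ys[j + 1]) / 2
            if any(r[0] <= cx <= r[2] and r[1] <= cy <= r[3] for r in rects):
                total += (xs[i + 1] - xs[i]) * (ys[j + 1] - ys[j])
    return total

def empty_space_ratio(changeset_bbox, object_bboxes):
    """Fraction of the changeset bbox not covered by any object bbox."""
    x0, y0, x1, y1 = changeset_bbox
    cs_area = (x1 - x0) * (y1 - y0)
    if cs_area == 0:
        return 0.0
    return 1.0 - union_area(object_bboxes) / cs_area

# Two small edits in opposite corners of a 10x10 changeset bbox:
# almost all of the bbox is "empty", so an editor might warn here.
ratio = empty_space_ratio((0, 0, 10, 10), [(0, 0, 1, 1), (9, 9, 10, 10)])
print(round(ratio, 2))  # 0.98
```

As the comment notes, this needs per-object bboxes, which aren’t in the dataset used for the analysis; the sketch only shows why the metric would separate a genuinely huge edit (like the Bermuda Triangle example) from two stray nodes on opposite sides of the planet.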
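The heatmap bucketing described above (“doubling the bucket size on every iteration”) amounts to binning each value by the power of two it falls under. A small illustrative sketch, assuming nothing about the actual analysis code beyond that doubling rule:

```python
import math

# Doubling buckets: bucket k covers [2^k, 2^(k+1)) -- square meters for the
# area heatmaps, meters for the perimeter variant. The function name is
# illustrative, not from the author's actual scripts.

def bucket(value):
    """Index k such that 2^k <= value < 2^(k+1); -1 for non-positive input."""
    if value <= 0:
        return -1
    return int(math.floor(math.log2(value)))

# The 2^40 m^2 threshold mentioned in the comments lands in bucket 40.
areas = [3.0, 500.0, 2.0 ** 40]
print([bucket(a) for a in areas])  # [1, 8, 40]
```

Using fewer, wider buckets (e.g. perimeter instead of area) condenses the “heat”, since each bucket absorbs a larger slice of the distribution.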
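The extrapolation in the two-object-changeset comment is simple arithmetic worth making explicit: 760 sampled changesets at a roughly 1:13 sampling factor scales to just under 10,000 per year.

```python
# Back-of-the-envelope extrapolation from the sampled figures quoted above.
sampled = 760          # sampled changesets with only two objects and bbox > 1,000 km^2
sampling_factor = 13   # sampling was roughly 1 in 13 changesets
estimate = sampled * sampling_factor
print(estimate)  # 9880, i.e. roughly 10,000 per year
```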