MultiPolygon #54
Comments
I figured this out the hard way too... the process would run out of memory, and I noticed a pattern: all the features that failed had multiple polygons. So I added a condition to my program to check for that.
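The guard described above can be sketched as a tiny predicate. This is a hypothetical sketch (the function name is not from this thread); it assumes you only want to hand plain Polygon coordinate arrays to polylabel and route MultiPolygon features to separate handling:

```javascript
// Hypothetical guard: only plain Polygon coordinate arrays go to polylabel;
// MultiPolygon features need separate handling before labeling.
function isSafeForPolylabel(feature) {
  return feature.geometry.type === "Polygon";
}
```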
If the GeoJSON feature is of type "Polygon", you can just pass the coordinates to polylabel. If it's "MultiPolygon", you can loop through the coordinates array to find the polygon with the largest area, and then pass that to polylabel. I do it like this, using d3.geoArea to calculate the areas:

```javascript
function findPolylabel(feature) {
  let output = [];
  if (feature.geometry.type === "Polygon") {
    output = polylabel(feature.geometry.coordinates);
  } else {
    // MultiPolygon: label the member polygon with the largest spherical area
    let maxArea = 0, maxPolygon = [];
    for (let i = 0, l = feature.geometry.coordinates.length; i < l; i++) {
      const p = feature.geometry.coordinates[i];
      const area = d3.geoArea({ type: "Polygon", coordinates: p });
      if (area > maxArea) {
        maxPolygon = p;
        maxArea = area;
      }
    }
    output = polylabel(maxPolygon);
  }
  return output;
}
```
In my implementation, I plan to loop through all polygons, finding the most-interior point of each, and choosing the point with the greatest interior room as the winner.
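That "deepest point wins" idea can be sketched as below. This is an assumption-laden sketch, not code from this thread: it takes the polylabel function as a parameter for testability (in practice you would import it from the polylabel package), and it relies on mapbox/polylabel attaching a `distance` property (the clearance from the point to the polygon outline) to the point it returns:

```javascript
// Sketch: run polylabel on every member polygon of a (Multi)Polygon and
// keep the candidate with the largest clearance (its `distance` property).
// `polylabelFn` stands in for the real polylabel function.
function findDeepestPole(feature, polylabelFn) {
  const geom = feature.geometry;
  // Normalize so we always iterate over an array of Polygon coordinate arrays
  const polygons = geom.type === "Polygon" ? [geom.coordinates] : geom.coordinates;
  let best = null;
  for (const coords of polygons) {
    const candidate = polylabelFn(coords);
    if (best === null || candidate.distance > best.distance) {
      best = candidate;
    }
  }
  return best;
}
```

Note this costs one polylabel run per member polygon, whereas the largest-area approach above runs polylabel only once.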
That algorithm has to be out there somewhere, though, as MultiPolygons are successfully handled by the default OSM Mapnik style labels.
I think this is less a matter of computation than of taste or preference. I would propose supporting MultiPolygons with two options:
(enhancement)
As far as the GeoJSON format is concerned, I found that "type": "MultiPolygon" is not supported. I suppose this would require a different kind of algorithm (I don't know?). Anyway, thanks for sharing!