Feed aggregator

Jackie Ng: Announcing: mapguide-react-layout 0.11

Planet OSGeo feeds - Tue, 02/20/2018 - 18:10
Here's a long overdue release of mapguide-react-layout with a ton of new features.

GIFs and screenshots galore below. Brace yourselves!

Adding external WMS layers

A new component is available to easily add external WMS layers to your map.



As previously mentioned, you can access this component through an InvokeURL command that uses a component://AddManageLayers URI instead of a normal URL.

Partial Application State now encoded in URL

Through the use of react-url-query, we now have the ability to transfer part of our internal application state into the URL, allowing us to reload the browser window and resume from where we left off.


The following bits of application state are now encoded into the URL:
  • Current view (x/y/scale)
  • Current active map (for multiple map configurations)
  • Shown/Hidden layer and group names
  • Current session id
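The heavy lifting is done by react-url-query inside the viewer itself, but the round-trip idea can be sketched in a few lines of Python (the parameter names below are illustrative, not the viewer's actual ones):

```python
from urllib.parse import urlencode, parse_qs

def encode_state(base_url, x, y, scale, active_map, session=None):
    """Encode a subset of viewer state into the URL query string."""
    params = {"x": x, "y": y, "scale": scale, "map": active_map}
    if session is not None:
        # omitted by default, like the Share Link to View component does
        params["session"] = session
    return base_url + "?" + urlencode(params)

def decode_state(query):
    """Recover the state dict from a query string on page reload."""
    return {k: v[0] for k, v in parse_qs(query).items()}

url = encode_state("https://example.com/viewer", -87.7, 43.7, 50000, "Sheboygan")
state = decode_state(url.split("?", 1)[1])
```

Reloading the page then only requires parsing the query string back into state, which is what makes the browser-refresh resume possible.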
Share Link to View

A new component is available to easily share these newly stateful URLs with other users.


Once again, you can access this new feature through an InvokeURL command that has a component://ShareLinkToView URI instead of a normal URL. The session ID is omitted by default, but you can tick Include Session ID to include it.

Measure Segment Display

This is a feature that was missing from the equivalent Fusion widget and has finally been ported over. The measure tool now properly displays individual measured segments.


Segments are only displayed for geodesic measurements. 

Drawing/Digitization improvements

Digitization now supports easily undoing the most recently drawn point by pressing the 'U' key.


This key and the key for cancelling the digitization (ESC) can be bound to different keys through new viewer mount options.

Template Improvements

The ajax-viewer template has been tweaked to look more like the original.

This is possible because of the use of react-splitter-layout that gives us resizable panels.

The slate, maroon, limegold and turquoiseyellow templates also use this new component to finally give us resizable sidebars.


HTML property values in Selection Panel

Through new mount options, you can specify property values that are HTML to be presented as actual HTML, meaning property values that used to look like this:


Now look like this:


This feature is disabled by default and you must opt into it (through new mount options). If the HTML content you're presenting is potentially untrusted, you can provide a sanitization function to clean the provided content (for example, with DOMPurify) to guard against possible cross-site scripting attacks.

Other Changes
  • Updated React to 16.2
  • Updated Blueprint to 1.35.5
  • Updated OpenLayers to 4.6.4
  • Updated TypeScript to 2.7.2
  • Debug viewer bundle (viewer-debug.js) now included
  • Added support for CenterSelection fusion widget. This marks the end of the Fusion widget porting work. No further Fusion widgets will be ported across. The list of Fusion widgets that will not be ported across can be found here.
  • Added support for extra extension properties in Redline and SelectPolygon widgets
  • Flyouts are now mutually exclusive, behaving more like their Fusion counterparts
  • Broken/404 toolbar/menu icons now gracefully show an error icon as a placeholder.
  • Added support for manual feature tooltip toggling. This replaces click-based map selection if active and can be controlled through the Viewer Options UI.
  • Fix: Initial view of the map not using full viewport.
  • Fix: Commands now fall back to running in modal dialog if Task Pane is not present.
  • AJAX map frame viewer API is now fully emulated.

Project Home Page
Download
mapguide-react-layout on npm

GeoServer Team: GeoServer 2.11.5 released

Planet OSGeo feeds - Tue, 02/20/2018 - 18:02

We are happy to announce the release of GeoServer 2.11.5. Downloads are available (zip, war, and exe) along with documentation and extensions (the OSX installer is currently missing as we’re unable to generate a signed installer due to security/infrastructure issues being discussed on geoserver-devel).

GeoServer 2.11.5 is the last maintenance release of the GeoServer 2.11.x series, so we recommend users plan an upgrade to 2.12.x or to the upcoming 2.13.x series. This release is made in conjunction with GeoTools 17.5.

Highlights of this release are featured below; for more information please see the release notes (2.11.5 | 2.11.4 | 2.11.3 | 2.11.2 | 2.11.1 | 2.11.0 | 2.11-RC1 | 2.11-beta).

Bug Fixes
  • Fixed GetFeatureInfo on rasters setup with “reproject to declared” SRS policy
  • Assorted fixes on the demo request page (password was not being sent)
  • Allow importer to handle multi-coverage files on import (NetCDF)
  • GetLegendGraphic fixes for cut symbols on rescale (happened with large symbols and odd sized legends)
  • WMS fixes on rendering rasters whose native CRS is a polar stereographic
  • And several more, check the release notes for full details
About GeoServer 2.11

Articles, docs, blog posts and presentations:

  • OAuth2 for GeoServer (GeoSolutions)
  • YSLD has graduated and is now available for download as a supported extension
  • Vector tiles have graduated and are now available for download as an extension
  • The rendering engine continues to improve, with underline labels now available as a vendor option
  • A new “opaque container” layer group mode can be used to publish a basemap while completely restricting access to the individual layers.
  • Layer group security restrictions are now available
  • Latest in performance optimizations in GeoServer (GeoSolutions)
  • Improved lookup of EPSG codes allows GeoServer to automatically match EPSG codes making shapefiles easier to import into a database (or publish individually).

Jackie Ng: React-ing to the need for a modern MapGuide viewer (Part 20): It doesn't have to *specifically* be a URL

Planet OSGeo feeds - Tue, 02/20/2018 - 16:51
Wow! It's been 5 months since the last post related to mapguide-react-layout.

No, I haven't stopped development. There's still plenty I want to achieve with this project before I consider it "done". It just had to take a back seat to other things that needed some attention.

Now that things have calmed down a bit, and before I talk about the new release of mapguide-react-layout that will drop real soon, I want to talk about a capability that has been present for some time now. I'm dedicating this entire post to it because understanding this capability is the key to knowing how we can add new features to mapguide-react-layout while retaining the ability to reference them from our existing Web Layouts and Application Definitions.

I'm sure most of you are already familiar with InvokeURL commands. You can define these commands in both a Web Layout and an Application Definition. They are represented in the AJAX/Fusion viewer as toolbar or menu items. When you click them, they invoke the specified URL in a given target (the Task Pane, a new window or a specific frame). InvokeURL commands are the basic gateway to custom MapGuide viewer functionality.

InvokeURL commands are also supported in mapguide-react-layout, except we can do extra things with them by leveraging the fact that a URL is merely a type of URI (Uniform Resource Identifier), and a URI ... does not necessarily have to start with http://. MapGuide resource identifiers (e.g. Library://Foo/Bar.MapDefinition) are basically URIs too, so this shouldn't be too foreign a concept.

So what does mapguide-react-layout do with this fact?

To demonstrate, here's a new feature that will be in the next release of mapguide-react-layout: A component to easily add external WMS layers


How can you access this new feature? Through an InvokeURL command.

Except you don't invoke a URL; you invoke a URI ... of our own custom creation.


As you can see from the above screenshot, we've invented our own component:// URI scheme to easily tap into new features in mapguide-react-layout without any structural changes to the Web Layout or Application Definition schemas. We just leverage the existing InvokeURL command support and the fact that whatever we're invoking in mapguide-react-layout ... doesn't necessarily have to be a URL. component:// URIs are structured as follows:

component://[component name][?query]
A component name refers to any component that's registered in the component registry. If a component can take parameters, you can pass them through the query string. Several components are available in the standard viewer bundle. The list of accessible components is available here.
If you use the npm module to roll your own custom viewer bundle, we provide APIs to allow you to easily create your own custom components. This is demonstrated in this example project.
The list of available components is small so far, but it will grow over time as we add new features in future releases of mapguide-react-layout. The best thing is that taking advantage of such features requires no changes to any authoring tools; you just enter a different kind of URI instead of a URL.
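To make the dispatch idea concrete, here is a small Python sketch (the viewer itself is written in TypeScript; this only illustrates how a component:// URI splits into a component name and parameters):

```python
from urllib.parse import urlsplit, parse_qs

def parse_component_uri(uri):
    """Split a component:// URI into its component name and parameters.

    Returns None for ordinary http(s) URLs, which are invoked as-is.
    """
    parts = urlsplit(uri)
    if parts.scheme != "component":
        return None
    # The netloc carries the component name; the query carries its parameters.
    return parts.netloc, {k: v[0] for k, v in parse_qs(parts.query).items()}

parse_component_uri("component://AddManageLayers")   # ("AddManageLayers", {})
parse_component_uri("component://ShareLinkToView?foo=bar")
parse_component_uri("http://example.com/page.php")   # None: a normal URL
```

In the real viewer the component name is looked up in the component registry and the query-string parameters are passed through to the component.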

    GeoTools Team: GeoTools 17.5 released

    Planet OSGeo feeds - Mon, 02/19/2018 - 16:11
    The GeoTools team is pleased to announce the release of GeoTools 17.5.
    This release, which is also available from the GeoTools Maven repository, is made in conjunction with GeoServer 2.11.5.

    GeoTools 17.5 is the last maintenance release of the 17.x series, time to consider upgrading!
    The release mainly fixes bugs but also includes some enhancements: 
    • Allow group extraction in image mosaic regular expression based property collector
    • Allow overriding LabelCache in renderer
    • Allow MetaBufferEstimator to work against a specific feature (to calculate sizes based on attribute values)
    For more information please see the release notes (17.5 | 17.4 | 17.3 | 17.2 | 17.1 | 17.0 | 17-RC1 | 17-beta).
    About GeoTools 17
    • The wfs-ng module has now replaced gt-wfs.
    • The NetCDF module now uses NetCDF-Java 4.6.6.
    • Image processing provided by JAI-EXT 1.0.15.
    • YSLD module providing a plain-text representation of styling.
    Upgrading
    • The AbstractDataStore has finally been removed. Please transition any custom DataStore implementations to ContentDataStore (tutorial available).

    gvSIG Team: GIS applied to Municipality Management: Module 13 ‘Layouts’

    Planet OSGeo feeds - Mon, 02/19/2018 - 11:49

    The video of the thirteenth module is now available, in which we will show how to create maps with the geographic information that we have in our views.

    The layout will be the document that we can print, or export to PDF or PostScript, and in which we will insert the views that we have created in our project.

    On the layout we can insert all types of elements, such as texts, north arrow, scale, legend, images or logos, charts, rectangles, lines…

    The cartography to follow this video can be downloaded from this link.

    Here you have the videotutorial of this new module:

    Related posts:

    Just van den Broecke: Emit #3 – Things are Moving

    Planet OSGeo feeds - Sat, 02/17/2018 - 00:03

    This is Emit #3, in a series of blog-posts around the Smart Emission Platform, an Open Source software component framework that facilitates the acquisition, processing and (OGC web-API) unlocking of spatiotemporal sensor-data, mainly for Air Quality. In Emit #1, the big picture of the platform was sketched. Subsequent Emits will detail technical aspects of the SE Platform. “Follow the data” will be the main trail loosely adhered to.

    It has been three weeks since Emit #2. A lot of Things have been happening since:

    A lot to expand on. I will try to briefly summarize The Things Conference, LoRa and LoRaWAN, and save the other tech for later Emits.

    LoRa and LoRaWAN may sound like a sidestep, but they are very much related to, for example, building a network of Air Quality and other environmental sensors. When deploying such sensors, two issues always arise:

    • power: need continuous electricity to keep sensors and their computing boards powered
    • transmission: need cabled Internet, WIFI or cellular data to transmit sensor-data

    In short, LoRa/LoRaWAN (LoRa = Long Range) is basically a wireless RF technology for long-range, low-power and low-throughput communications. You may find many references on the web, for example from the LoRa Alliance and Semtech. There is lots of buzz around LoRa. But just like the Wireless Leiden project, which built a public WIFI network around that city, The Things Network has embraced LoRa technology to build a world-wide open, community-driven “overlay network”:

    “The Things Network is building a network for the Internet of Things by creating abundant data connectivity, so applications and businesses can flourish. The technology we use is called LoRaWAN and it allows for things to talk to the internet without 3G or WiFi. So no WiFi codes and no mobile subscriptions. It features low battery usage, long range and low bandwidth. Perfect for the internet of things.”

    You may want to explore the worldwide map of TTN gateways below.

    And The Things Network (TTN) was established in Amsterdam, The Netherlands. As an individual you can extend The Things Network by deploying a Gateway. I was one of the first backers of the TTN Kickstarter project, back in 2015. The interest was overwhelming, even leading to (Gateway) delivery problems. But a LoRa Gateway to extend TTN is almost a commodity now. You can even build one yourself. TTN is very much tied to the whole “DIY makers movement”. All TTN designs and code (on GitHub) are open. Below is a global architecture picture from their site.

    So TTN organized their first conference, of course in Amsterdam. For three days it was an amazing success, with more than 500 enthusiasts.

    The conf was very hands-on, with lots of free workshops (and free takeaway hardware). I followed several workshops, which were intense (hardware+software hacking) but always rewarding (blinking green lights!). One to mention in particular (as a Python programmer) was on the LoPy, a sort of Arduino board, very low cost (around $30), programmable with MicroPython, that connects directly to TTN. Within an hour the board was happily sending meteo-data to TTN.

    All in all this conference made me eager to explore more of LoRa with TTN, in particular the possibilities for citizen-based sensor networks for environmental data, air quality in particular. I am aware that “IoT” has some bad connotations when it comes to security, especially with closed technologies. But IoT is a movement we cannot stop. With an end-to-end open technology like TTN there is at least the possibility to avoid the “black box” part and take Things into our own hands.

    gvSIG Team: Recording of webinar on “gvSIG Suite: open source software for geographic information management in agriculture” is now available

    Planet OSGeo feeds - Fri, 02/16/2018 - 10:33

    If you weren’t able to follow the webinar on “gvSIG Suite: open source software for geographic information management in agriculture”, organized by GODAN and the gvSIG Association, you can now watch the recording at the gvSIG Youtube channel:

    The webinar was oriented to show the gvSIG Suite, a complete catalog of open source software solutions, applied to agriculture.

    The gvSIG Suite is composed of ‘horizontal’ products:

    • gvSIG Desktop: Geographic Information System for editing, 3D analysis, geoprocessing, maps, etc.
    • gvSIG Online: Integrated platform for Spatial Data Infrastructure (SDI) implementation.
    • gvSIG Mobile: Mobile application for Android to take field data.

    and sector products:

    • gvSIG Roads: Platform to manage roads inventory and conservation.
    • gvSIG Educa: gvSIG adapted to geography learning in pre-university education.
    • gvSIG Crime: Geographic Information System for Criminology management.

    At the webinar we also showed several successful case studies in agriculture and forestry. You can also consult other case studies in this sector and other related sectors at the gvSIG Outreach website (they are in their original language but there’s a translator on the left side):

    The presentation is available at this link.

    If you want to download gvSIG Desktop you can do it from the gvSIG website, gvSIG Mobile is available from the Play Store, and if you are interested in implementing gvSIG Online in your organization you can contact us by e-mail: info@gvsig.com.

    If you have any doubts or problems with the application you can use the mailing lists.

    And here you have several links about training on gvSIG, with free courses:

    gvSIG Team: GIS applied to Municipality Management: Module 12 ‘Geoprocessing’

    Planet OSGeo feeds - Thu, 02/15/2018 - 09:52

    The video of the twelfth module is now available, in which we will see the geoprocessing tools in gvSIG.

    gvSIG has more than 350 geoprocesses, both for raster and vector layers, which allow us to perform different types of analysis, for example to obtain the optimal areas to locate a specific type of infrastructure.

    Using the geoprocesses available in gvSIG we can, for example, create buffers to calculate, among other things, road or railway rights of way. An intersection can then be applied with a parcel layer to obtain the part of each parcel that should be expropriated. We can also perform hydrological analysis, merge layers…

    The cartography to follow this video can be downloaded from this link.

    Here you have the videotutorial of this new module:

    Related posts:

    GeoSolutions: Latest on GeoNode: Local Monitoring Dashboard

    Planet OSGeo feeds - Wed, 02/14/2018 - 17:51

    Dear All,

    in this post we would like to introduce a plugin which we have developed for GeoNode and released as Open Source (full documentation available here), in order to give users the ability to keep under control the hardware and software load on a GeoNode installation. This plugin, called Monitoring, is available as an Open Source contrib module for GeoNode; documentation on how to enable and configure it can be found here.

    Overview of the monitoring capabilities

    We are now going to provide an overview of the functionality provided by this plugin; it is worth pointing out that, given the sensitive nature of the information shown, it is accessible only to GeoNode users that have the administrator role.

    The plugin allows administrators to keep under control the hardware and software load on a GeoNode instance by collecting, aggregating, storing and indexing a large amount of information that is normally hidden and spread across various logs which are difficult to find when troubleshooting, like GeoNode's own log, the GeoServer log, the audit log and so on; in addition, we also collect information about hardware load on memory and CPU (disk could be added easily), which is important for live instances.

    It is also possible to create alerts that check certain conditions and then send a notification email to preconfigured email addresses (more on this here).

    It is also possible to look at OGC service statistics on a per-service and per-layer basis. Finally, a simple country-based map that shows where requests are coming from is available.

    Overview of the available analytics

    Let us now dive a little into the functionalities provided by this plugin. Here below you can see the initial homepage of the plugin.

    [Figure: Monitoring Plugin Homepage]

    We tried to put on the homepage a summary of the available information so that a user can quickly understand what is going on. The first row provides a series of visual controls that give an overview of the instance's health at different aggregation time ranges (from 10 mins to 1 Week):
    • Health Check - tells us if there are any alerts or errors that would require the attention of the administrator. Colors range from Red (at least one Error has happened in the selected time range) to Yellow (no Errors, but at least one Alert has triggered within the selected time range) and finally to Green (no Alerts or Errors).
    • Uptime - shows GeoNode system uptime.
    • Alerts - shows the number of notifications from defined checks. When clicked, the Alerts box will show detailed information. See the Notifications description for details.
    • Errors - shows how many errors were captured during request processing. When clicked, the Errors box will show a detailed list of captured errors.
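As a sketch of the traffic-light logic described above (not the plugin's actual code), the mapping from counts within the selected time range to a colour is simply:

```python
def health_status(errors: int, alerts: int) -> str:
    """Map error/alert counts in the selected time range to the health colour."""
    if errors > 0:
        return "red"      # at least one Error has happened
    if alerts > 0:
        return "yellow"   # no Errors, but at least one Alert triggered
    return "green"        # no Alerts or Errors

health_status(0, 0)  # 'green'
```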
    The Software Performance section is responsible for showing analytics about the overall performance of GeoNode itself as well as the OGC back-end.

    [Figure: Software Performance Summary Dashboard]

    If we click on the icon in the upper right corner, the detailed view will be shown, as illustrated below. Additional detailed information over the selected time period is shown for GeoNode, for the OGC services, and for individual layers and resources.

    [Figure: Software Performance Detailed Dashboard - 1] [Figure: Software Performance Detailed Dashboard - 2]

    The Hardware Performance section is responsible for showing analytics about CPU and Memory usage of the machine where GeoNode runs as well as of the machine where GeoServer runs (in case it runs on a separate machine); see figure below.

    [Figure: Hardware Performance Detail Section]

    Interesting points and next steps

    The plugin provides additional functionality which is described in detail in the GeoNode documentation (see this page); as mentioned above, we can inspect errors in the logs directly from the plugin, and we can set alerts that send notification emails when they trigger. Moreover, the plugin provides a few additional endpoints that make it easier to monitor a GeoNode instance from the GeoHealthCheck Open Source project (as explained here).

    [Figure: Inspecting the error log]

    Last but not least, we want to thank the GFDRR group at the World Bank, which provided most of the funding for this work.

    If you are interested in learning how we can help you achieve your goals with open source products like GeoServer, MapStore, GeoNode and GeoNetwork through our Enterprise Support Services and GeoServer Deployment Warranty offerings, feel free to contact us!

    The GeoSolutions team,

    CARTO Inside Blog: CARTO Core Team and 5x

    Planet OSGeo feeds - Wed, 02/14/2018 - 11:00

    CARTO is open source, and is built on core open source components, but historically most of the code we have written has been in the “CARTO” parts, and we have used the core components “as is” from the open source community.

    While this has worked well in the past, we want to increase the velocity at which we improve our core infrastructure, and that means getting intimate with the core projects: PostGIS, Mapnik, PostgreSQL, Leaflet, MapBox GL and others.

    Our new core technology team is charged with being the in-house experts on the key components, and the first problem they have tackled is squeezing more performance out of the core technology for our key use cases. We called the project “5x” as an aspirational goal – can we get multiples of performance improvement from our stack? We knew “5x” was going to be a challenge, but by trying to get some percentage improvement from each step along the way, we hoped to at least get a respectable improvement in global performance.

    Our Time Budget

    A typical CARTO visualization might consist of a map and a couple widget elements.

    The map will be composed of perhaps 12 (visible) tiles, which the browser will download in parallel, 3 or 4 at a time. In order to get a completed visualization delivered in under 2 seconds, that implies the tiles need to be delivered in under 0.5s and the widgets in no more than 1s.

    Ideally, everything should be faster, so that more load can be stacked onto the servers without affecting overall performance.

    The time budget for a tile can be broken down even further:

    • database retrieval time,
    • data transit to map renderer,
    • map render time,
    • map image compression, and
    • image transit to browser.

    The time budget for a widget is basically all on the database:

    • database query execution, and
    • data transit to JavaScript widget.

    The project goal was to add incremental improvements to as many slices as possible, which would hopefully together add up to a meaningful difference.
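The arithmetic behind these budgets can be made explicit. A small Python sketch using the figures above (12 visible tiles, downloaded 3 at a time, a 2-second target for the visualization):

```python
def tile_budget(total_s, visible_tiles, parallel_downloads):
    """Per-tile latency budget: tiles download in batches of `parallel_downloads`,
    and every batch must finish inside the total page budget."""
    batches = -(-visible_tiles // parallel_downloads)  # ceiling division
    return total_s / batches

# 12 visible tiles, 3 at a time, 2 s budget -> 4 sequential batches of 0.5 s each
tile_budget(2.0, 12, 3)
```

With 4 parallel downloads the per-tile budget relaxes to about 0.67 s, which is why the text settles on roughly 0.5 s per tile as the safe target.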

    Measure Twice, Cut Once

    In order to continuously improve the core components, we needed to monitor how changes affected the overall system, against both a long-term baseline (for project-level measurements) and short-term baselines (for patch-level measurements).

    To get those measurements, we:

    • Enhanced the metrics support in Mapnik, so we could measure the amount of time spent in retrieving data, rendering data, and compressing output.
    • Built an internal performance harness, so we can measure the cost of various workloads end-to-end.
    • Carried out micro-benchmarks of particular workloads at the component level. For PostGIS, that meant running particular SQL against sample data. For Mapnik that meant running particular kinds of data (large collections of points, or lines, or polygons) through the renderer with various stylings.
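A micro-benchmark in this spirit needs very little machinery. A minimal Python sketch (our actual harness and workloads differ) is:

```python
import time

def bench(fn, repeats=5):
    """Run a workload several times and keep the best wall-clock time,
    which reduces noise from other load on the machine."""
    best = float("inf")
    for _ in range(repeats):
        start = time.perf_counter()
        fn()
        best = min(best, time.perf_counter() - start)
    return best

# Example workload standing in for "render N points" or "run this SQL"
elapsed = bench(lambda: sum(i * i for i in range(100000)))
```

Taking the best of several runs, rather than the mean, is a common choice for patch-level comparisons because it filters out scheduler hiccups.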

    Using the measurements as a guide, we then attacked the performance problem.

    Low Hanging Fruit

    Profiling and running performance tests, and doing a little bit of research showed up three major opportunities for performance improvements:

    • PostgreSQL parallelism was the biggest potential win. With version 10 coming out shortly, we had an opportunity to get “free” improvements “just” by ensuring all the code in CARTO was parallel safe and marked as such. Reviewing all the code for parallel safety also surfaced a number of other potential efficiency improvements.
    • Mapnik turned out to have a couple areas where performance could be improved, through caching features rather than re-querying, and in improving the algorithms used for rendering large collections of points.
    • PostGIS had some small bottlenecks in the critical path for CARTO rendering, including some inefficient memory handling in TWKB that impacted point encoding performance.

    Most importantly, during our work on core code improvements, we brought all the core software into the CARTO build and deployment chain, so these and future improvements can be quickly deployed to production without manual intervention.

    We want to bring our improvements back to the community versions, and at the same time have early access to them in the CARTO infrastructure, so we follow a policy of contributing improvements to the community development versions while back-patching them into our own local branches (PostGIS, Mapnik, PostgreSQL).

    And In the End

    Did we get to “5x”? No: in our end-to-end benchmarks we notched a range of improvements, from a few percent to a few times, depending on the use case. We also found our integration benchmarks were sensitive to pressure from other load on our testing servers, so we relied mostly on micro-benchmarks of individual components to confirm local performance improvements.

    While the performance improvements have been gratifying, some of the biggest wins have been the little improvements we made along the way:

    We made a lot of performance improvements across all the major projects on which CARTO is based: you may have already noticed them. We’ve also shown that optimization can only get you so far; sometimes taking a whole new approach is a better plan. A good example of this is the vector and raster data aggregation work we have been doing, reducing the amount of data transferred with clever summarization and advanced styling.

    More changes from our team are still rolling out, and you can expect further platform improvements as time goes on. Keep on mapping!

    GeoNode: GeoNode Summit 2018

    Planet OSGeo feeds - Wed, 02/14/2018 - 01:00
    Join the awesome GeoNode community for the Summit 2018, from 26 to 28 March 2018, in the elegant city of Turin, Italy!

    Summit Website



    OSGeo.nl: Report: OSGeo.nl and OpenStreetMap NL New Year's Gathering 2018

    Planet OSGeo feeds - Tue, 02/13/2018 - 23:33

    On Sunday 14 January 2018, the traditional OSGeo.nl and OpenStreetMap NL New Year's gathering, in the meanwhile fully renovated upstairs room of Cafe Dudok in Hilversum, was once again well attended. It is one of the few events where these two communities come together (we should do that more often!).

    Every year it becomes clear that these communities share interests and common ground: not only around the open data of the Dutch base registries (BAG, BGT, BRK, Top10NL, etc.) and projects that work with them, such as NLExtract, but also around, for example, Missing Maps and QGIS. It whets the appetite for more events in 2018 to jointly strengthen open source and open data for geo-information in the Netherlands.

    Among those present there was clearly a lot of knowledge and, above all, enthusiasm to share it. Besides the bitterballen and craft beers tasting great as ever, there were a number of presentations, announcements and plans. More below, with links to the slides:

    1. OSGeo.nl: Looking back at 2017, plans for 2018 – Just van den Broecke – Slides PDF
    Gert-Jan van der Weijden stepped down after 5 years of outstanding OSGeo.nl chairmanship. The current OSGeo.nl board consists of Just van den Broecke (chair), Paulo van Breugel (secretary) and Barend Köbben (treasurer). Above all, 2017 was the year in which the first FOSS4G NL took place, in Groningen. Thanks to the efforts of a great team including Erik Meerburg, Leon van der Meulen, Willy Bakker and many volunteers from the University of Groningen, the event was a huge success. More on the follow-up later. Looking ahead, our ambitions for 2018 are too many to list: above all a FOSS4G NL 2018, but also, given for example the overwhelming number of registrations for our GRASS Crash Course, we want to focus in 2018 on more small-scale, targeted, hands-on events. Let us know if you have ideas for these.

    2. Raymond Nijssen – 510.global data team, Red Cross – Slides PDF

    Raymond took us to Sint Maarten in his often gripping personal story as a volunteer for the 510.global data team of the Red Cross, for which he had signed up shortly after hurricane Irma. With the limited means and connectivity there, Raymond and the team, using the OpenStreetMap ecosystem and tools such as QGIS, managed to effectively produce overview maps for aid workers. Kudos, Raymond, you are an example to us all!

    3. Rob van Loon, Ordina – Managing GeoServer configuration with Python scripts – Slides PDF


    GeoServer is deployed in a great many places in the Netherlands. The accompanying GeoServer Web UI for configuring layers and styling is handy, but in many situations automated configuration of GeoServer is much more effective: think of DTAP pipelines, or of multiple, sometimes hundreds of, nearly identical layers, with many repeated manual steps. A not very widely known REST API for configuring GeoServer remotely has existed for years, and it has become increasingly stable and powerful in recent GeoServer versions. Rob has developed a toolkit for this, soon to be available under https://github.com/borrob.
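The pattern behind such scripted configuration is simple: the GeoServer REST API exposes workspaces, stores and layers as resources under /rest, and any HTTP client can drive it. A minimal hedged sketch in Python (stdlib only; the server URL and credentials below are placeholders, not a real deployment):

```python
import base64
import json
import urllib.request

def geoserver_request(base, path, user, password, method="GET", payload=None):
    """Build an authenticated request against the GeoServer REST API.

    The returned Request can be passed to urllib.request.urlopen to execute.
    """
    req = urllib.request.Request(base.rstrip("/") + path, method=method)
    token = base64.b64encode(f"{user}:{password}".encode()).decode()
    req.add_header("Authorization", "Basic " + token)
    if payload is not None:
        req.data = json.dumps(payload).encode()
        req.add_header("Content-Type", "application/json")
    return req

# List workspaces with GET; creating one is the same call with POST and a payload
req = geoserver_request("http://localhost:8080/geoserver", "/rest/workspaces.json",
                        "admin", "geoserver")
```

Scripting against this API is what makes hundreds of nearly identical layers manageable: the same function call in a loop replaces hundreds of clicks in the Web UI.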

    4. Willem Hofmans (JW van Aalst) – White Spots and Black Holes in the BGT


    Willem, standing in for the absent Jan-Willem van Aalst, presented a voyage of discovery through the nooks and details of the BGT (the Dutch large-scale base topography). There was still much to discover. White Spots, where the BGT is not yet complete, are published regularly by Jan-Willem on his website. Black Holes, Willem's main topic, are exciting too: are there errors in the BGT itself, or in the tools, such as NLExtract, that parse its often over-complicated GML? Border areas between rectangles and curves; it became a journey full of surprises. Follow-up within NLExtract is already underway.

    Further announcements:

    Erik Meerburg (with Hans van der Kwast): the first QGIS User Day on 31 January 2018 at IHE in Delft. It has since taken place and was an overwhelming success, with over 100 participants. A QGIS NL User Group is coming; more about that soon, keep following us here!

    Erik Meerburg: FOSS4G NL 2018: at the time there were talks with several universities and universities of applied sciences, as everyone would like to host this event. Since then, considerable progress has been made. Save the date: Wednesday 11 July 2018 at Aeres Hogeschool in Almere. More news to follow!

    Edward Mac Gillavry: following the day's earlier discussions about base registries (in particular the BGT and NLExtract) and OSGeo.nl's aim of organizing smaller, targeted events/workshops/hackathons/code sprints, WebMapper, also a sponsor of the OSGeo.nl Meetup, offered to organize an NLExtract Day in 2018. Format, place and time are still to be determined; more about this here and via the OSGeo.nl channels.

    All in all another fine afternoon, with the vast majority of attendees also joining the shared dinner at Cafe Dudok afterwards.


    Gary Sherman: Quick Guide to Getting Started with PyQGIS 3 on Windows

    Planet OSGeo feeds - Tue, 02/13/2018 - 10:00

    Getting started with Python and QGIS 3 can be a bit overwhelming. In this post we give you a quick start to get you up and running and maybe make your PyQGIS life a little easier.

    There are likely many ways to setup a working PyQGIS development environment---this one works pretty well.

    Contents

    Requirements
    • OSGeo4W Advanced Install of QGIS
    • pip (for installing/managing Python packages)
    • pb_tool (cross-platform tool for compiling/deploying/distributing QGIS plugin)
    • A customized startup script to set the environment (pyqgis.cmd)
    • IDE (optional)
    • Emacs (just kidding)
    • Vim (just kidding)

    We'll start with the installs.

    Installing

    Almost everything we need can be installed using the OSGeo4W installer available on the QGIS website.

    OSGeo4W

    From the QGIS website, download the appropriate network installer (32 or 64 bit) for QGIS 3.

    • Run the installer and choose the Advanced Install option
    • Install from Internet
    • Choose a directory for the install---I prefer a path without spaces such as C:\OSGeo4W
    • Accept default for local package directory and Start menu name
    • Tweak network connection option if needed on the Select Your Internet Connection screen
    • Accept default download site location
    • From the Select packages screen, select: Desktop -> qgis: QGIS Desktop

    When you click Next a bunch of additional packages will be suggested---just accept them and continue the install.

    Once complete you will have a functioning QGIS install along with the other parts we need. If you want to work with the nightly build of QGIS, choose Desktop -> qgis-dev instead.

    If you installed QGIS using the standalone installer, the easiest option is to remove it and install from OSGeo4W. You can run both the standalone and OSGeo4W versions on the same machine, but you need to be extra careful not to mix up the environment.

    Setting the Environment

    To continue with the setup, we need to set the environment by creating a .cmd script. The following is adapted from several sources, and trimmed down to the minimum. Copy and paste it into a file named pyqgis.cmd and save it to a convenient location (like your HOME directory).

    @echo off
    SET OSGEO4W_ROOT=C:\OSGeo4W3
    call "%OSGEO4W_ROOT%"\bin\o4w_env.bat
    call "%OSGEO4W_ROOT%"\apps\grass\grass-7.4.0\etc\env.bat
    @echo off
    path %PATH%;%OSGEO4W_ROOT%\apps\qgis-dev\bin
    path %PATH%;%OSGEO4W_ROOT%\apps\grass\grass-7.4.0\lib
    path %PATH%;C:\OSGeo4W3\apps\Qt5\bin
    path %PATH%;C:\OSGeo4W3\apps\Python36\Scripts
    set PYTHONPATH=%PYTHONPATH%;%OSGEO4W_ROOT%\apps\qgis-dev\python
    set PYTHONHOME=%OSGEO4W_ROOT%\apps\Python36
    set PATH=C:\Program Files\Git\bin;%PATH%
    cmd.exe

    You should customize the set PATH statement to add any paths you want available when working from the command line. I added paths to my git install.

    The last line starts a cmd shell with the settings specified above it. We'll see an example of starting an IDE in a bit.

    You can test to make sure all is well by double-clicking on our pyqgis.cmd script, then starting Python and attempting to import one of the QGIS modules:

    C:\Users\gsherman>python3
    Python 3.6.0 (v3.6.0:41df79263a11, Dec 23 2016, 07:18:10) [MSC v.1900 32 bit (Intel)] on win32
    Type "help", "copyright", "credits" or "license" for more information.
    >>> import qgis.core
    >>> import PyQt5.QtCore

    If you don't get any complaints on import, things are looking good.

    Installing pb_tool

    Open your customized shell (double-click on pyqgis.cmd to start it) to install pb_tool:

    python3 -m pip install pb_tool

    Check to see if pb_tool is installed correctly:

    C:\Users\gsherman>pb_tool
    Usage: pb_tool [OPTIONS] COMMAND [ARGS]...

      Simple Python tool to compile and deploy a QGIS plugin. For help on a
      command use --help after the command: pb_tool deploy --help.

      pb_tool requires a configuration file (default: pb_tool.cfg) that
      declares the files and resources used in your plugin. Plugin Builder
      2.6.0 creates a config file when you generate a new plugin template.

      See http://g-sherman.github.io/plugin_build_tool for an example config
      file. You can also use the create command to generate a best-guess
      config file for an existing project, then tweak as needed.

      Bugs and enhancement requests, see:
      https://github.com/g-sherman/plugin_build_tool

    Options:
      --help  Show this message and exit.

    Commands:
      clean       Remove compiled resource and ui files
      clean_docs  Remove the built HTML help files from the...
      compile     Compile the resource and ui files
      config      Create a config file based on source files in...
      create      Create a new plugin in the current directory...
      dclean      Remove the deployed plugin from the...
      deploy      Deploy the plugin to QGIS plugin directory...
      doc         Build HTML version of the help files using...
      help        Open the pb_tools web page in your default...
      list        List the contents of the configuration file
      translate   Build translations using lrelease.
      update      Check for update to pb_tool
      validate    Check the pb_tool.cfg file for mandatory...
      version     Return the version of pb_tool and exit
      zip         Package the plugin into a zip file suitable...

    If you get an error, make sure C:\OSGeo4W3\apps\Python36\Scripts is in your PATH.

    More information on using pb_tool is available on the project website.

    Working on the Command Line

    Just double-click on your pyqgis.cmd script from the Explorer or a desktop shortcut to start a cmd shell. From here you can use Python interactively and also use pb_tool to compile and deploy your plugin for testing.

    IDE Example

    By adding one line to our pyqgis.cmd script, we can start our IDE with the proper settings to recognize the QGIS libraries:

    start "PyCharm aware of Quantum GIS" /B "C:\Program Files (x86)\JetBrains\PyCharm 3.4.1\bin\pycharm.exe" %*

    We added the start statement with the path to the IDE (in this case PyCharm). If you save this to something like pycharm.cmd, you can double-click on it to start PyCharm. The same method works for other IDEs, such as PyDev.

    Within your IDE settings, point it to use the Python interpreter included with OSGeo4W---typically at: %OSGEO4W_ROOT%\bin\python3.exe. This will make it pick up all the QGIS goodies needed for development, completion, and debugging. In my case OSGEO4W_ROOT is C:\OSGeo4W3, so in the IDE, the path to the correct Python interpreter would be: C:\OSGeo4W3\bin\python3.exe.

    Make sure you adjust the paths in your .cmd scripts to match your system and software locations.

    Workflow

    Here is an example of a workflow you can use once you're set up for development.

    Creating a New Plugin
    1. Use the Plugin Builder plugin to create a starting point [1]
    2. Start your pyqgis.cmd shell
    3. Use pb_tool to compile and deploy the plugin (pb_tool deploy will do it all in one pass)
    4. Activate it in QGIS and test it out
    5. Add code, deploy, test, repeat

    Working with Existing Plugin Code

    The steps are basically the same as creating a new plugin, except we start by using pb_tool to create a new config file:

    1. Start your pyqgis.cmd shell
    2. Change to the directory containing your plugin code
    3. Use pb_tool create to create a config file
    4. Edit pb_tool.cfg to adjust/add things create may have missed
    5. Start at step 3 in Creating a New Plugin and press on

    Troubleshooting

    Assuming you have things properly installed, trouble usually stems from an incorrect environment.

    • Make sure QGIS runs and the Python console is available and working
    • Check all the paths in your pyqgis.cmd or your custom IDE cmd script
    • Make sure your IDE is using the Python interpreter that comes with OSGeo4W

    [1] Plugin Builder 3.x generates a pb_tool config file

    Blog 2 Engenheiros: How to organize your maps in ArcGIS and QGIS?

    Planet OSGeo feeds - Tue, 02/13/2018 - 08:07

    Not everyone is organized, and that is no different for people who have spent 5+ years on a university degree.

    If I asked you for the class materials from your undergraduate Organic Chemistry course, would you be able to hand them over? Or would they be lost in some remote folder that not even Windows could find?

    If you have organization problems and work with geoprocessing, we will give you tips to get organized and stop losing your GIS files.

    What does your desktop look like? Organized or a mess? Source: ATRL.

    Advantages of Organization

    Once you become an organized person, you will notice benefits such as spending less time searching for your files and fixing mistakes. If you don't waste time on that, you will have more time for productive activities.

    Don't think that good document-filing practices have disappeared just because we no longer use paper.

    Aaron Lynn, on the Asian Efficiency site, presents some rules to follow to keep your computer organized:

    • Don't save documents on the Desktop;
    • Limit the creation of new folders;
    • Get used to thinking in hierarchies;
    • Create a folder for completed projects (an archive).

    Among these rules, the idea of hierarchy is the most important, because you will have to classify your documents, for example, as Personal or Work; inside the Work folder you may have further classes, such as Ongoing Projects, Completed Projects and Administrative Documents.

    Example of a hierarchy-based organization for a work folder.
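A hierarchy like this is easy to bootstrap with a short script. Here is a sketch using Python's `pathlib`; the folder names simply mirror the example above and are purely illustrative.

```python
# Sketch: create the example Work/Personal folder hierarchy with pathlib.
# Folder names are illustrative; point `root` at your real documents folder.
import tempfile
from pathlib import Path

def create_hierarchy(root, tree):
    """Recursively create the nested folder structure described by `tree`."""
    for name, subtree in tree.items():
        folder = Path(root) / name
        folder.mkdir(parents=True, exist_ok=True)
        create_hierarchy(folder, subtree)

hierarchy = {
    "Work": {
        "Ongoing Projects": {},
        "Completed Projects": {},   # the "archive" for finished work
        "Administrative Documents": {},
    },
    "Personal": {},
}

root = Path(tempfile.mkdtemp())  # a scratch directory for this demo
create_hierarchy(root, hierarchy)
```

Running this once gives you the skeleton; from then on, every new GIS project gets its own folder under Ongoing Projects and moves to Completed Projects when it is done.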

    Sounds complicated? Once you get used to being organized, things will start to flow easily. Still seems hard? Here are some sites that can help you become more organized:

    Organization in GIS

    Now let's apply some of these ideas to managing the files of our Geographic Information System. Watch our video and find out how.

    Follow our tips and you will be more organized. Organization is a skill we develop over time, so don't give up.

    And if you have any organization suggestions of your own, leave them in the comments.

    Free and Open Source GIS Ramblings: TimeManager 2.5 published

    Planet OSGeo feeds - Mon, 02/12/2018 - 22:39

    TimeManager 2.5 is quite likely going to be the final TimeManager release for the QGIS 2 series. It comes with a couple of bug fixes and enhancements:

    • Fixed #245: updated help.htm
    • Fixed #240: now hiding unmanageable WFS layers
    • Fixed #220: fixed issues with label size
    • Fixed #194: now exposing additional functions: animation_time_frame_size, animation_time_frame_type, animation_start_datetime, animation_end_datetime

    Besides updating the help, I also decided to display it more prominently in the settings dialog (similarly to how the help is displayed in the field calculator or in Processing):

    So far, I haven’t started porting to QGIS 3 yet. If you are interested in TimeManager and want to help, please get in touch.

    On this note, let me leave you with a couple of animation inspirations from the Twitterverse:

    Here's my attempt, Thanks @tjukanov for the inspiration and @dig_geo_com for the Tutorial. Distances traveled in 2,5,7,9,11,13,15 minutes in Cluj-Napoca, Romania. Made with @QGIS#TimeManager,@openstreetmap, data from @here pic.twitter.com/tEhXfy6CAs

    — vincze istvan (@spincev) February 3, 2018

    More typographic experiments with Danish ship data.
    – 24 hours of ship traffic
    – Size of ship name defined by a combination of vessel size and speed
    – Direction of the name is defined by vessel's course pic.twitter.com/2p16Qo1Kuf

    — Topi Tjukanov (@tjukanov) January 11, 2018

    How the City of #Zurich has grown since 1875.

    Made with @QGIS#TimeManager.
    Thx to @underdarkGIS and @tjukanov for the inspiration.#geogiffery #DataViz #Map #cartography pic.twitter.com/TY2flppsVA

    — Marco Sieber (@DonGoginho) January 17, 2018

    gvSIG Team: The gvSIG Association will participate in ILoveFS 2018

    Planet OSGeo feeds - Mon, 02/12/2018 - 13:09

    For the third consecutive year, Datalab is organizing ILoveFS18 at MediaLab Prado, a space to show affection and love for Free Software, naturally on 13 and 14 February.

    The gvSIG Association will be there, showing that free GIS can be used in almost any area of our lives. We will run a basic gvSIG workshop applied to journalism, so that everyone can learn Free Geomatics in a practical and fun way.

    The sessions will run from 18:45 to 20:30 on Tuesday the 13th and from 18:45 to 20:00 on Wednesday the 14th at the MediaLab Prado premises.

    Of course, the event and the workshops are completely free; you only need to register on the website set up for that purpose.

    Full information is available at http://medialab-prado.es/article/ilovefs18

    And you can download the portable version for your operating system at http://www.gvsig.com/es/productos/gvsig-desktop/descargas


    CARTO Inside Blog: ETL into CARTO with ogr2ogr

    Planet OSGeo feeds - Mon, 02/12/2018 - 11:39

    The default CARTO data importer is a pretty convenient way to quickly get data into the platform, but for enterprises setting up automated update it has some limitations:

    • there’s no way to define type coercions for CSV data;
    • some common GIS formats like File Geodatabase aren’t supported;
    • the sync facility is “pull” only, so data behind a firewall is inaccessible; and,
    • the sync cannot automatically filter the data before loading it into the system.

    Fortunately, there’s a handy command-line tool that can automate many common enterprise data loads: ogr2ogr.

    Basic Operation

    ogr2ogr has a well-earned reputation for being hard to use. The commandline options are plentiful and terse, the standard documentation page lacks examples, and format-specific documentation is hidden away with the driver documentation.

    Shapefile

    The basic structure of an ogr2ogr call is “ogr2ogr -f format destination source”. Here’s a simple shapefile load.

    ogr2ogr \
      --debug ON \
      --config CARTO_API_KEY abcdefghijabcdefghijabcdefghijabcdefghij \
      -t_srs "EPSG:4326" \
      -nln interesting \
      -f Carto \
      "Carto:pramsey" \
      interesting_things.shp

    The parameters are:

    • --debug turns on verbose debugging, which is useful during development to see what’s happening behind the scenes.
    • --config is used to pass generic “configuration parameters”. In this case we pass our CARTO API key so we are allowed to write to the database.
    • -t_srs is the “target spatial reference system”, telling ogr2ogr to convert the spatial coordinates to “EPSG:4326” (WGS84) before writing them to CARTO. The CARTO driver expects inputs in WGS84, so this step is mandatory.
    • -nln is the “new layer name”, so the name of the uploaded table can differ from that of the input file.
    • -f is the format of the destination layer, so for uploads to CARTO, it is always “Carto”.
    • Carto:pramsey is the “destination datasource”, so it’s a CARTO source, in the “pramsey” account. Change this to your user name. (Note for multi-user accounts: you must supply your user name here, not your organization name.)
    • interesting_things.shp is the “source datasource”, which for a shapefile is just the path to the file.
    File Geodatabase

    Loading a File Geodatabase is almost the same as loading a shapefile, except that a file geodatabase can contain multiple layers, so the conversion must also specify which layer to convert, by adding the source layer name after the data source. You can load multiple layers in one run by providing multiple layer names.

    ogr2ogr \
      --debug ON \
      --config CARTO_API_KEY abcdefghijabcdefghijabcdefghijabcdefghij \
      -t_srs "EPSG:4326" \
      -nln cities \
      -f Carto \
      "Carto:pramsey" \
      CountyData.gdb Cities

    In this example, we take the “Cities” layer from the county database, and write it into the “cities” table of CARTO. Note that if you do not re-map the layer name to all lower case, you’ll get a mixed case layer in CARTO, which you may not want.

    Filtering

    You can use OGR on any input data source to filter the data prior to loading. This can be useful for loads of large inputs that are “only the data since time X” or “only the data in this region”, like this:

    ogr2ogr \
      --debug ON \
      --config CARTO_API_KEY abcdefghijabcdefghijabcdefghijabcdefghij \
      -t_srs "EPSG:4326" \
      -nln cities \
      -f Carto \
      -sql "SELECT * FROM Cities WHERE state_fips = 53" \
      "Carto:pramsey" \
      CountyData.gdb Cities

    Since the filter is just a SQL statement, the filter can both reduce the number of records and also apply transforms to the output on the way: reduce the number of columns, apply some data transformations, anything that is possible using the SQLite dialect of SQL.
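Repeated loads like these are easy to wrap in a small script so the filter, target table, and credentials live in one place. A hypothetical Python helper (the function name and argument defaults are mine, not part of GDAL) that assembles the same argument list and hands it to subprocess:

```python
# Sketch: build the ogr2ogr argument list for a CARTO upload in Python.
# carto_load_cmd is a hypothetical helper; the API key below is a dummy.
import subprocess

def carto_load_cmd(api_key, account, source, layer=None, new_name=None,
                   sql=None, overwrite=False):
    """Assemble an ogr2ogr command line for uploading `source` to CARTO."""
    cmd = ["ogr2ogr", "--config", "CARTO_API_KEY", api_key,
           "-t_srs", "EPSG:4326", "-f", "Carto"]
    if new_name:
        cmd += ["-nln", new_name]
    if sql:
        cmd += ["-sql", sql]
    if overwrite:
        cmd.append("-overwrite")
    cmd += ["Carto:" + account, source]
    if layer:
        cmd.append(layer)
    return cmd

cmd = carto_load_cmd("abcdefghij", "pramsey", "CountyData.gdb",
                     layer="Cities", new_name="cities",
                     sql="SELECT * FROM Cities WHERE state_fips = 53")
# subprocess.run(cmd, check=True)  # requires GDAL's ogr2ogr on the PATH
```

From here a cron job or scheduled task only needs to call the helper with the right filter for "data since time X".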

    Overwrite or Append

    By default, ogr2ogr runs in “append” mode (you can force it with the -append flag), so if you run the same translation multiple times, you’ll get rows added to your table. This can be useful for processes that regularly take the most recent entries and copy them into CARTO.

    For translations where you want to replace the existing table, use the -overwrite mode, which will drop the existing table, and create a new one in its place.

    Because of some limitations in how the OGR CARTO driver handles primary keys, the OGR -update mode does not work correctly.

    OGR Virtual Format

    As you can see, the command-line complexity of an OGR conversion starts high. The complexity only goes up as advanced features like filtering and arbitrary SQL are added.

    To contain the complexity in one location, you can use the OGR “virtual format”, VRT files, to define your data sources. This is handy for managing a library of conversions in source control. Each data source becomes its own VRT file, and the actual OGR commands become smaller.

    CSV Type Enforcement

    CSV files are convenient ways of passing data, but they are under-defined: they supply column names, but not column types. This forces CSV consumers to do type guessing based on the input data, or to coerce every input to a lowest common denominator string type.

    Particularly for repeated and automated uploads it would nice to define the column types once beforehand and have them respected in the final CARTO table.

    For example, take this tiny CSV file:

    longitude,latitude,name,the_date,the_double,the_int,the_int16,the_int_as_str,the_datetime
    -120,51,"First Place",2018-01-01,2.3,123456789,1234,00001234,2014-03-04 08:12:23
    -121,52,"Second Place",2017-02-02,4.3,423456789,4234,00004234,"2015-05-05 09:15:25"

    Using a VRT, we can define a CSV file as a source, and also add the rich metadata needed to support proper type definitions:

    <OGRVRTDataSource>
      <OGRVRTLayer name="test_csv">
        <SrcDataSource>/data/exports/test_csv.csv</SrcDataSource>
        <GeometryField encoding="PointFromColumns" x="longitude" y="latitude"/>
        <GeometryType>wkbPoint</GeometryType>
        <LayerSRS>WGS84</LayerSRS>
        <OpenOptions>
          <OOI key="EMPTY_STRING_AS_NULL">YES</OOI>
        </OpenOptions>
        <Field name="name" type="String" nullable="false" />
        <Field name="a_date" type="Date" src="the_date" nullable="true" />
        <Field name="the_double" type="Real" nullable="true" />
        <Field name="the_int" type="Integer" nullable="true" />
        <Field name="the_int16" type="Integer" subtype="Int16" nullable="true" />
        <Field name="the_int_as_str" type="String" nullable="true" />
        <Field name="the_datetime" type="DateTime" nullable="true" />
      </OGRVRTLayer>
    </OGRVRTDataSource>

    This example has a number of things going on:

    • The <SrcDataSource> is an OGR connection string, as defined in the driver documentation for the format. For a CSV, it’s just the path to a file with a “csv” extension.
    • The <GeometryField> line maps coordinate columns into a point geometry.
    • The <LayerSRS> confirms the coordinates are WGS84. They could also be some planar format, and OGR can reproject them if requested.
    • The <OpenOptions> let us pass one of the many CSV open options.
    • The <Field> type definitions, using the “type” attribute to explicitly define types, including obscure ones like 16-bit integers.
    • Column renaming, in the “a_date” <Field>, maps the source column name “the_date” to “a_date” in the target.
    • Null enforcement, in the “name” <Field>, creates a target column with a NOT NULL constraint.
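If you maintain many such files, the boilerplate can itself be generated. Here is a sketch using Python's stdlib XML writer; it covers only the elements used in the example above, and `csv_vrt` is a hypothetical helper, not part of GDAL.

```python
# Sketch: generate a CSV VRT (like the one above) from a small field spec.
# csv_vrt is a hypothetical helper; it emits only the VRT elements shown here.
import xml.etree.ElementTree as ET

def csv_vrt(layer_name, csv_path, fields, x="longitude", y="latitude"):
    """Return VRT XML declaring explicit column types for a CSV source."""
    ds = ET.Element("OGRVRTDataSource")
    layer = ET.SubElement(ds, "OGRVRTLayer", name=layer_name)
    ET.SubElement(layer, "SrcDataSource").text = csv_path
    ET.SubElement(layer, "GeometryField", encoding="PointFromColumns", x=x, y=y)
    ET.SubElement(layer, "GeometryType").text = "wkbPoint"
    ET.SubElement(layer, "LayerSRS").text = "WGS84"
    for spec in fields:
        # Each spec dict becomes the attributes of one <Field> element.
        ET.SubElement(layer, "Field", **spec)
    return ET.tostring(ds, encoding="unicode")

vrt = csv_vrt("test_csv", "/data/exports/test_csv.csv", [
    {"name": "name", "type": "String", "nullable": "false"},
    {"name": "a_date", "type": "Date", "src": "the_date"},
    {"name": "the_int16", "type": "Integer", "subtype": "Int16"},
])
```

Writing the returned string to a `.vrt` file gives you a source you can version-control and feed straight to ogr2ogr.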

    To execute the translation, we use the VRT as the source argument in the ogr2ogr call.

    ogr2ogr \
      --debug ON \
      --config CARTO_API_KEY abcdefghijabcdefghijabcdefghijabcdefghij \
      -t_srs "EPSG:4326" \
      -f Carto \
      "Carto:pramsey" \
      test_csv.vrt

    Database-side SQL

    Imagine you have an Oracle database with sales information in it, and you want to upload a weekly snapshot of transactions. The database is behind the firewall, and the transactions need to be joined to location data in order to be mapped. How to do it?

    With VRT tables and the OGR Oracle driver, it’s just some more configuration during the load step:

    <OGRVRTDataSource>
      <OGRVRTLayer name="detroit_locations">
        <SrcDataSource>OCI:scott/password@ora.company.com</SrcDataSource>
        <LayerSRS>EPSG:26917</LayerSRS>
        <GeometryField encoding="PointFromColumns" x="easting" y="northing"/>
        <SrcSQL>
          SELECT sales.sku, sales.amount, sales.tos,
                 locs.latitude, locs.longitude
          FROM sales
          JOIN locs ON sales.loc_id = locs.loc_id
          WHERE locs.city = 'Detroit'
            AND sales.transaction_date > '2018-01-01'
        </SrcSQL>
      </OGRVRTLayer>
    </OGRVRTDataSource>

    Some things to note in this example:

    • The <SrcDataSource> holds the Oracle connection string
    • The coordinates are stored in UTM17, in northing/easting columns, but we can still easily map them into a point type for reprojection later.
    • The output data source is actually the result of a join executed on the Oracle database, attributing each sale with the location it was made. We don’t have to ship the tables to CARTO separately.

    The ability to run any SQL on the source database is a very powerful tool to ensure that the uploaded data is “just right” before it arrives on the CARTO side for analysis and display.

    As before, the VRT is run with a simple execution of the ogr2ogr command line:

    ogr2ogr \
      --debug ON \
      --config CARTO_API_KEY abcdefghijabcdefghijabcdefghijabcdefghij \
      -t_srs "EPSG:4326" \
      -f Carto \
      "Carto:pramsey" \
      test_oracle.vrt

    Multi-format Sources

    Suppose you have a source of attribute information and a source of location information, but they are in different formats in different databases: how to bring them together in CARTO? One way, as usual, would be to upload them separately and join them on the CARTO side with SQL. Another way is to use the power of ogr2ogr and VRT to do the join during the data upload.

    For example, imagine having transaction data in a PostgreSQL database, and store locations in a Geodatabase. How to bring them together? Here’s a joined_stores.vrt file that does the join in ogr2ogr:

    <OGRVRTDataSource>
      <OGRVRTLayer name="sales_data">
        <SrcDataSource>Pg:dbname=pramsey</SrcDataSource>
        <SrcLayer>sales.sales_data_2017</SrcLayer>
      </OGRVRTLayer>
      <OGRVRTLayer name="stores">
        <SrcDataSource>store_gis.gdb</SrcDataSource>
        <SrcLayer>Stores</SrcLayer>
      </OGRVRTLayer>
      <OGRVRTLayer name="joined">
        <SrcDataSource>joined_stores.vrt</SrcDataSource>
        <SrcSQL dialect="SQLITE">
          SELECT stores.*, sales_data.*
          FROM sales_data
          JOIN stores ON sales_data.store_id = stores.store_id
        </SrcSQL>
      </OGRVRTLayer>
    </OGRVRTDataSource>

    Some things to note:

    • The “joined” layer uses the VRT file itself in the <SrcDataSource> definition!
    • Each <OGRVRTLayer> is a full-fledged VRT layer, so you can do extra processing in them. Apply type definitions to CSV, run complex SQL on a remote database, whatever you want.
    • The join layer uses the “SQLite” dialect, so anything available in SQLite is available to you in the join step.
    Almost an ETL

    Combining the ability to read and write from multiple formats with the basic functionality of the SQLite engine, and chaining operations through multi-layer VRT layers, ogr2ogr provides the core functions of an “extract-transform-load” engine, in a package that is easy to automate and maintain.

    For users with data behind a firewall, who need more complex processing during their loads, or who have data locked in formats the CARTO importer cannot read, ogr2ogr should be an essential tool.

    Getting ogr2ogr

    OGR is a subset of the GDAL suite of libraries and tools, so you need to install GDAL to get ogr2ogr.

    • For Linux, look for “gdal” packages in your Linux distribution of choice.
    • For Mac OS X, use the GDAL Framework builds.
    • For Windows, use the MS4W package system or pull the MapServer builds from GISInternals and use the included GDAL binaries.

    gvSIG Team: GIS applied to Municipality Management: Module 11 ‘Reprojecting vector layers’

    Planet OSGeo feeds - Mon, 02/12/2018 - 09:29

    The video of the eleventh module is now available, in which we will show how to reproject vector layers.

    Municipalities sometimes need external geographic information, for example cartography published by another administration, such as a regional or national one. That cartography may be in a reference system different from the one the municipality's technicians usually work with. If the reference systems are not taken into account, the two datasets will not overlap correctly.

    The municipality technicians can also use old cartography, which is in an obsolete reference system, and they need to have it in an updated reference system. For this, it will be necessary to reproject that cartography.

    In module 2 you can consult all the information related to the reference systems.

    Apart from reprojecting from one reference system to another, it is sometimes necessary to apply a transformation to improve the reprojection. In the case of Spain, for example, to reproject a layer from ED50, the official reference system until a few years ago, to ETRS89, currently the official system, a grid-based transformation must be applied; otherwise the layers would be offset by about 7 meters.

    The cartography to follow this video can be downloaded from this link.

    Here you have the videotutorial of this new module:

    Related posts:

    From GIS to Remote Sensing: Available the User Manual of the Semi-Automatic Classification Plugin v. 6

    Planet OSGeo feeds - Sat, 02/10/2018 - 16:34
    I've updated the user manual of the Semi-Automatic Classification Plugin (SCP) for the new version 6. This updated version contains the description of all the tools included in SCP, as well as a brief introduction to remote sensing definitions and the first basic tutorial.
    The user manual in English is available at this link. Also, other languages are available, although some translations are incomplete. 
    I'd like to deeply thank all the volunteers that have translated the previous versions of this user manual, and I invite you to help translating this new version to your language.
    It is possible to easily translate the user manual to any language, because it is written in reStructuredText as markup language (using Sphinx). Therefore, your contribution is fundamental for the translation of the manual to your language.


    Jackie Ng: One MapGuide/FDO build system to rule them all?

    Planet OSGeo feeds - Fri, 02/09/2018 - 17:38
    Before the bombshell of the death of Autodesk Infrastructure Map Server was dropped, I was putting the final finishing touches of making (pun intended) the MapGuide/FDO build experience on Linux a more pleasant experience.

    Namely, I'd had it with autotools (some may describe it as autohell) as the way to build MapGuide/FDO on Linux. For FDO, this was the original motivation for introducing CMake as an alternative. CMake is a more pleasant way to build software on Linux than autotools because:
    1. CMake builds your sources outside of the source tree. If you're doing development on a SVN working copy this is an absolute boon as it means when it comes time to commit any changes, you don't have to deal with sifting through tons of autotools-generated junk that is left in your actual SVN wc.
    2. CMake builds are faster than their autotools counterpart.
    3. It is much easier to find and consume external libraries with CMake than it is through autotools, which makes build times faster because we can just source system-installed copies of thirdparty libraries we use, instead of waste time having to build these copies (in our internal thirdparty source tree) ourselves. If we are able to use system-installed copies of libraries when building FDO, then we can take advantage of SVN sparse checkouts and be able to skip downloading whole chunks of thirdparty library sources that we never have to build!
    Sadly, while this sounds nice in theory, the CMake way to build FDO had fallen into a state of disrepair. My distaste for autotools was sufficient motivation to get the CMake build back into working condition. Several weeks of bashing at various CMakeLists.txt files later, the FDO CMake build was operational again and had some several major advantages over the autotools build (in addition to what was already mentioned):
    • We can setup the CMake to generate build configurations for Ninja instead of standard make. A ninja-powered CMake build is faster than standard make ^.
    • On Ubuntu 14.04 LTS (the current Ubuntu version we're targeting), all the thirdparty libraries we use were available for us to apt-get install in the right version ranges, and the CMake build can take advantage of all of them. Not a single internal thirdparty library copy needs to be built!
    • We can easily light up compiler features like AddressSanitizer and linking with the faster gold instead of ld. AddressSanitizer in particular easily helped us catch some issues that have flew under the radar.
    • All of the unit tests are buildable and, more importantly ... executable outside the source tree, making it easier to fix whatever was failing.
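A configure step that lights up all of the above might look roughly like this; the source path and build directory name are assumptions for the sake of the sketch:

```shell
# Configure an out-of-source build using the Ninja generator, with
# AddressSanitizer enabled and gold as the linker.
# (The ../fdo source path and build dir name are illustrative.)
mkdir -p build && cd build
cmake -G Ninja \
      -DCMAKE_BUILD_TYPE=Debug \
      -DCMAKE_CXX_FLAGS="-fsanitize=address" \
      -DCMAKE_EXE_LINKER_FLAGS="-fuse-ld=gold" \
      ../fdo

# Build with Ninja; ASan-instrumented binaries report memory errors at runtime
ninja
```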
    Although we now had a functional FDO CMake build, MapGuide was still built on Linux using autotools. So, for the same reasons and motivations, I started the process of introducing CMake to the MapGuide build system for Linux.

    Unlike FDO, MapGuide still needed some of the internal thirdparty libraries built.
    • DBXML - No Ubuntu package available, though we can get it to build against a system-provided version of Xerces, so we can at least skip building that part of DBXML.
    • Apache HTTPD - Ubuntu package available, but having MapGuide integrate with an Ubuntu-provided httpd installation was not in the scope of this work, nice as that would be to have.
    • PHP - Same reasons as Apache HTTPD.
    • Antigrain Geometry - No Ubuntu package available. Also, the AGG sources are basically wedded to our Renderers project anyway.
    • DWF Toolkit - No Ubuntu package available.
    • CS-Map - No Ubuntu package available.
    For everything else, Ubuntu provided packages in the right version ranges for CMake to take advantage of. Another few weeks of bashing various CMakeLists.txt files into shape and we had both FDO and MapGuide buildable on Linux via CMake. To solve the problem of still needing to build some internal thirdparty libs while retaining the CMake quality of everything being built outside the source tree, wrapper shell scripts are provided that copy the applicable thirdparty library sources out of the source tree, build them in their copied directories, and then invoke CMake with all the required parameters so that it knows where to find the internal libraries to link against when building MapGuide proper.
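A wrapper script of the kind described might look roughly like this; the directory names, per-library build step, and the CMake cache variable are all hypothetical, invented here for illustration, and the real scripts in the MapGuide tree will differ:

```shell
#!/bin/sh
# Illustrative sketch of an out-of-source wrapper build script.
# Copy only the internal thirdparty libs that must be built (e.g. DBXML,
# DWF Toolkit, CS-Map) out of the source tree, build them in the copies,
# then point CMake at the results. All names below are hypothetical.
SRC_ROOT=/path/to/mapguide/source
BUILD_ROOT=$PWD

mkdir -p "$BUILD_ROOT/thirdparty"
for lib in dbxml dwf csmap; do
    cp -r "$SRC_ROOT/Oem/$lib" "$BUILD_ROOT/thirdparty/$lib"
    (cd "$BUILD_ROOT/thirdparty/$lib" && ./build.sh)   # build in the copy
done

# Configure MapGuide itself, telling CMake where the internal libs landed
cmake -G Ninja \
      -DINTERNAL_THIRDPARTY_DIR="$BUILD_ROOT/thirdparty" \
      "$SRC_ROOT"
```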

    This was also backported to FDO, so that on distros where not all of our required thirdparty libraries are available, we can selectively build internal copies, find/link the rest from the system, and have CMake take care of all of it for us.

    So what's with the title of this post?

    Remember when I wrote about how interesting vcpkg was?

    What is best used with vcpkg to easily consume thirdparty libraries on Windows? Why, CMake of course! Now, building MapGuide on Windows via CMake is not on the immediate horizon. We'll still be maintaining Visual Studio project files by hand (instead of auto-generating them with CMake) for the immediate future, but can you imagine being able to build FDO and MapGuide on both Windows and Linux with CMake, without having to waste time on huge SVN checkouts and building thirdparty libraries? That future is starting to look really possible now!
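The reason vcpkg and CMake pair so well is vcpkg's CMake toolchain file, which makes find_package resolve against vcpkg-installed libraries. A minimal sketch; the vcpkg install location and generator choice are assumptions:

```shell
# Install a thirdparty library with vcpkg, then let CMake discover it
# through vcpkg's toolchain file (the C:/dev/vcpkg location is illustrative).
vcpkg install xerces-c

cmake -G "Visual Studio 15 2017" \
      -DCMAKE_TOOLCHAIN_FILE=C:/dev/vcpkg/scripts/buildsystems/vcpkg.cmake \
      path/to/source
```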

    For the next major release of MapGuide Open Source, my plan is to use CMake instead of autotools as the way to build both MapGuide and FDO on Linux.

    ^ Well, the Ninja-powered CMake build used to be blazing fast until Meltdown and Spectre happened. My dev environment got the OS security patches, and whatever build performance gains Ninja and CMake had delivered were instantly wiped out; we were back to square one in terms of build time. Still, the autotools build performed worse after the Meltdown patches, so while CMake still beats the autotools build on build time, we ultimately gained nothing on this front.

    Thanks Intel!!!
