In the context of my research, I need to be able to import the layers from EarthMap (FAO) and Global Forest Watch. Is there any particular link to connect these databases to QGIS?
After a first look, Earthdata seems to be a geoportal.
It offers a lot of input choices, especially for ecological monitoring, teachers, or managers, but the scale is very large. On this portal you can export the results in an image format. You can access the tools via this link: https://earthdata.nasa.gov/learn/pathfinders/gis-pathfinder/gis-tools
The same analysis applies to Global Forest Watch. This website uses Mapbuilder.
Data from Global Forest Watch, an online forest monitoring and alert system, has been available on Esri's ArcGIS Online cloud GIS service since 2014. By using a portal on Esri's platform to access Global Forest Watch satellite data and crowd-sourced information, people can add powerful maps, datasets, and applications to their forest projects and better analyze indicators of forest change.
Michael John Patrick: Thanks for your reply; I am just not sure I understand your purpose. I would like to be able to extract and analyse, within my specific polygons, the data shown on the maps proposed by EarthMap or Global Forest Watch. That's why I asked whether I could import them into QGIS.
Vanderheyden Lisa INRE: 'Purpose'. Because "import the layers" and "connect these databases" encompass quite a wide range of QGIS functionality and capabilities, the choice among them depends on the capabilities and characteristics of the data publisher, and frequently multiple choices are involved. This can range from very simple CSV plain-text feeds in real time ( https://www.geodose.com/2020/09/realtime%20live%20data%20visualization%20qgis.html ) to global datasets in somewhat uncommon projections ( GHSL is in Mollweide, https://desktop.arcgis.com/en/arcmap/10.3/guide-books/map-projections/mollweide.htm ; SoilGrids is in Homolosine, https://www.isric.org/explore/soilgrids/faq-soilgrids#How_can_I_use_the_Homolosine_projection ). Geoportals can have a range of capabilities, from simple 'display only' of an image ( WMS ) to full queries and re-projection of the data for your download ( WFS, Web Feature Service: for retrieving or altering feature descriptions ). These can also extend to direct database connections using ODBC, API calls, custom plugins provided by the publisher, etc.
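To make the 'display only' end of that range concrete, here is a minimal PyQGIS sketch for loading a WMS layer. The endpoint URL and layer name are hypothetical placeholders, not actual EarthMap or GFW services; substitute whatever the publisher documents.

```python
# Run from the QGIS Python console. Minimal 'display only' WMS example;
# the URL and layer name below are hypothetical placeholders.
from qgis.core import QgsRasterLayer, QgsProject

uri = (
    "crs=EPSG:4326&format=image/png&styles="
    "&layers=example_layer"                    # hypothetical layer name
    "&url=https://example.org/geoserver/wms"   # hypothetical endpoint
)
layer = QgsRasterLayer(uri, "Example WMS layer", "wms")
if layer.isValid():
    QgsProject.instance().addMapLayer(layer)
else:
    print("WMS connection failed - check the service URL and layer name")
```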
A GIS will happily display data in different projections using 'on the fly' reprojection, but analysis combining vectors in different projections can have variable results if you don't prepare beforehand. Your project might use a particular local projection ( like Washington State Plane North ), a WMS can be serving Web Mercator, and the soil data can be in the Homolosine projection. Your analysis processing may depend on distances, or shape, or direction, all of which are affected by each dataset's original projection ( https://en.wikipedia.org/wiki/Tissot%27s_indicatrix ). If you are using raster data, the mismatches can cause all sorts of artifacts through a tool chain. ( And 'layer' usually means the data plus the cartographic styling, which is another can of worms. )
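For that reason, a common precaution is to materialize a reprojected copy of each dataset in the project CRS before any distance or area computation, instead of relying on on-the-fly display reprojection. A minimal sketch with a stock QGIS processing algorithm; the paths and target EPSG code are illustrative:

```python
# QGIS Python console: write a reprojected copy of a vector layer
# before analysis. Input/output paths and the target CRS are examples.
import processing

processing.run("native:reprojectlayer", {
    "INPUT": "/path/to/project_polygons.gpkg",   # hypothetical input
    "TARGET_CRS": "EPSG:32631",                  # e.g. a local UTM zone
    "OUTPUT": "/path/to/project_polygons_utm.gpkg",
})
```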
Some, most, or all of these are usually documented somewhere on the publisher's web site, either as server connection protocols, service strings, or file metadata. NASA and the USGS are really, really good; others from NGOs are occasionally derived from other data and are really, really bad.
That's why I asked for the links, so I could give you a complete answer with any required tips and cautions.
A simple answer is 'Yes, QGIS can do that', but the real answer is 'Yes, but however ...'.
Vanderheyden Lisa I may still not understand your difficulty - did the Google Earth Engine plug-ins not appear in the QGIS application's 'Search for Plug-ins'? Also, in addition to the generic GEE plug-ins, there are more specialized ones like Firehunter and Sentinel-2 utilities, and some plug-in authors don't list theirs in the QGIS repository, for various reasons.
Data:
"... Google Earth Engine tutorial for using Hansen et al. (2013) global forest cover and change data and Forest Monitoring for Action (FORMA, Hammer et al. 2009) data from Global Forest Watch. This tutorial provides examples of how to use Earth Engine to visualize these data, how to compute forest change over time and other statistics within a region of interest and how to download both the data and results of analyses. " ( from
"Google Earth Engine plugin for QGIS - Integrates Google Earth Engine with QGIS using Python API."( from https://github.com/gee-community/qgis-earthengine-plugin )
Plug-in Code Examples:
"This repository is a collection of 220+ Python examples for the Google Earth Engine plugin for QGIS. I developed these Python examples by converting all the JavaScript examples (except those not yet supported by the plugin) from the Google Earth Engine API Documentation. Additionally, some examples were adapted from Gena’s examples and the Earth Engine API examples. Kudos to Gennadii Donchyts for developing this amazing Google Earth Engine plugin for QGIS." from (
Michael John Patrick Herve Parmentier Jean-François Bastin
: Thank you for all your replies :) I think all of that gives me a deep insight into what I need.
My research is conducted in the context of monitoring restoration projects funded by a company. Methods must be quite user-friendly, not too long to implement, and such that different employees can handle them. At the end of my contract, other people will be expected to use my suggestions and protocols. So this question has a double interest:
1) How could the data from EarthMap and Global Forest Watch be useful for monitoring restoration projects? Can we derive quantitative indicators?
--> You provided me different databases to have access to these data, but also I just find that within the FAQ section:
EarthMap: data can be exported on Geotiff format (see Help center)
Global Forest Watch: Downloadable data from Global Forest Watch Open Data Portal : https://data.globalforestwatch.org
Jean-François Bastin
, thank you for your kind suggestion; I just found the information now, so it will not be necessary.
2) I'm carrying out a benchmark of existing geospatial technologies for the company in that context. That's why I'm trying to find out whether geoportals can be connected to QGIS. I need at least basic GIS operations: a raster calculator (for zonation), zonal statistics on vectors, and point sampling methods.
--> Our discussion clarifies this point too. Connecting these geoportals to QGIS is not possible right now. So that would be a finding of my benchmark of QGIS's capabilities. I thought about QGIS because it's free and quite complete. From this discussion, I note ArcGIS Online and Mapbuilder as things to investigate.
About the Google Earth Engine Editor, I feel able to train myself. But I doubt it would be user-friendly for other people who are not always trained in GIS basics. What do you think about it?
Vanderheyden Lisa ( The general theme of the entire post indicates moving from consideration of isolated possible features to a much broader attempt to define the architecture of a geospatial system of some sort, i.e. more than just algorithms and functions, but also integration, usability, licensing, and other potential issues. )
> My research is conducted in the context restoration projects monitoring founded by a company.
If the eventual use is commercial, not research or non-profit, one might want to begin keeping track of how much the licensing costs, and in the case of ArcGIS Online, the recurring fees. See https://doc.arcgis.com/en/arcgis-online/administer/credits.htm , and scroll down the "Credits by capability" chart to the rows labeled "Spatial Analysis" and "Imagery Analysis", but also "ArcGIS GeoEnrichment Service" and "Tile Generation" if you are considering some sort of web user interface. Of course, some of these functions can alternatively be done on the desktop, but that requires another sort of ArcGIS license, plus some manual routine or automation to synchronize the Online side with the local one.
Additionally, the local processing can be done with components from the OSGeo stack - from GDAL/OGR scripts, to within QGIS ( https://docs.qgis.org/2.8/en/docs/user_manual/processing/modeler.html ) or Python, to inside a PostgreSQL database ( PostGIS, https://postgis.net/docs/PostGIS_Special_Functions_Index.html#PostGIS_RasterFunctions ).
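For instance, a clip-and-reproject step that one might click through in the QGIS GUI can be done in a few lines of the GDAL Python bindings; the file names and target CRS below are placeholders:

```python
# One local processing step via the GDAL/OGR Python bindings rather
# than the QGIS GUI. All file names and the target CRS are illustrative.
from osgeo import gdal

gdal.UseExceptions()
gdal.Warp(
    "clipped_utm.tif",          # output raster
    "earthmap_export.tif",      # hypothetical GeoTIFF exported from a portal
    dstSRS="EPSG:32631",        # reproject to a local UTM zone
    cutlineDSName="aoi.gpkg",   # clip to the project polygons
    cropToCutline=True,
)
```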
Companies are invariably price- and cost-conscious. In addition to the software, if the use is commercial, you need to consider the terms and conditions of the data carefully; note "... Some data displayed on the Global Forest Watch platform was developed by other organizations and may carry other licensing or permissions." at https://data.globalforestwatch.org/pages/data-policy
> Methods must be quiet user friendly, not too long to implement, different employees can handle. At the end of my contract, other people will be willing to use my suggestion, protocols.
You have just invoked the age-old 'iron triangle' ( https://armedia.com/blog/the-fastgoodcheap-rule-of-software-development-2/ ). There is another fundamental rule: 'user friendly' usually means some sort of GUI, and 95% of the difficulty of a piece of software is concentrated in the user interface. It also implies some degree of automation or bundling of tasks. The general rule is that the 'simpler' an interface is for a complex task, the more the cost of the system rises, exponentially.
Compare two implementations that provide the same information. One is a Python script with a one-page document of instructions, launched when a human gets a calendar notification and clicks a desktop icon; it dumps a KML file of areas into a folder, and the user just opens the KML in Google Earth Pro. The other is a GUI web application with a map display. The first could be implemented and tested in a day and run happily on any operating system for a decade; the other might take weeks to develop and break constantly whenever some dependency or browser changes. Similar to the script example would be using Google Earth Engine ( https://un-spider.org/links-and-resources/gis-rs-software/google-earth-engine-google ).
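The entire 'simple' variant can be nearly as short as this sketch, which converts whatever the processing chain produced into a KML the user double-clicks open in Google Earth Pro; the file names are hypothetical:

```python
# Roughly the whole 'simple script' from the comparison above: convert
# the analysis output to KML for Google Earth Pro. File names are
# hypothetical placeholders.
from osgeo import gdal

gdal.UseExceptions()
gdal.VectorTranslate(
    "restoration_areas.kml",   # the file the user opens in GE Pro
    "analysis_results.gpkg",   # hypothetical output of the processing chain
    format="KML",
)
print("Done - open restoration_areas.kml in Google Earth Pro")
```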
> Through that question has a double interest :
> 1) How/are the data from Earthmap and Global Forest Watch could useful to monitor restoration project? So can we derive quantitative indicators?
Are you certain those datasets have 'fitness for use' ( https://www.researchgate.net/publication/262646692_Evaluating_the_fitness_for_use_of_spatial_data_sets_to_promote_quality_in_ecological_assessment_and_monitoring ) to get the expected results? See http://gfw.blog.s3.amazonaws.com/Data%20Playbook/GFW%20Data%20Playbook%20v3.pdf : their data is not necessarily uniform over the entire globe; it is aggregated from many different sources, over different time scales, with different update latencies, and with varying precision and accuracy ( see the tiering of prioritization ).
It is probably very, very good and frequently updated for the United States and the European Union; other areas might see months or years between updates. The resolution and classification accuracy will also vary: their 'minimum' is 30 x 30 meters, which might be okay for calculating total canopy for a country, but useless if one is monitoring vegetation regrowth at artisanal mining sites that may span only a pixel or two, or very narrow linear features like stream bank restoration.
The quantitative indicators are fundamentally going to depend on those factors for your area of interest. The indicators will also depend on the tempo of the available time series and on whether a simple binary change detection is sufficient, or a state change from one classification to another ( forest to bare earth ), or a more complex change metric such as between growth stages of a canopy derived from multi-spectral unmixing ( 15 feet to 30 feet over a five-year interval ).
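As an illustration of the simplest binary case, an indicator from the Hansen 'lossyear' band (0 = no loss, 1..n = year of loss) could be computed as below, assuming the band has already been clipped to the area of interest and exported as a GeoTIFF; the file name is hypothetical:

```python
# Toy binary change indicator: fraction of valid 30 m pixels flagged
# as loss within an already-clipped raster. File name is a placeholder.
import numpy as np
import rasterio

with rasterio.open("lossyear_aoi.tif") as src:
    lossyear = src.read(1)
    nodata = src.nodata

# Mask out nodata pixels, then count pixels with any recorded loss year.
valid = np.ones(lossyear.shape, dtype=bool) if nodata is None else lossyear != nodata
loss_fraction = np.count_nonzero((lossyear > 0) & valid) / np.count_nonzero(valid)
print(f"Share of valid pixels with recorded loss: {loss_fraction:.3f}")
```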
> --> You provided me different databases to have access to these data, but also I just find that within the FAQ section:
> EarthMap: data can be exported on Geotiff format (see Help ceunter)
> Global Forest Watch: Downloadable data from Global Forest Watch Open Data Portal : https://data.globalforestwatch.org
Almost all authoritative data sources offer multiple means of accessing the data through web services, API calls, and/or FTP file downloads. Which one to use depends on your use case.
> 2) I'm realizing a benchmark of geospatial existing technologies for the company in that context. In my opinion, that's why I try to know if geoportals can be connected to QGIS. I need to use at least basic system GIS operation : matrix calculator (for zonation), statistic zonal vectors, sampling points methods.
Underneath all of it is GDAL ( https://gistbok.ucgis.org/bok-topics/gdalogr-and-geospatial-data-io-libraries ), and then Pythonic ( https://gisgeography.com/python-libraries-gis-mapping/ ), Java ( https://geotools.org/about.html ), or C++ libraries which implement the more specialized algorithms - see Figure 3 in the previous link.
All the GIS user interface applications ( like QGIS and ArcMap ), with their toolboxes and model builders, are built on top of those. So it is possible to do the same geoprocessing at every level, from the command line to cloud services like Google Earth Engine.
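The zonal statistics you mention, for example, is one processing.run() call in QGIS, with the same underlying machinery the GUI toolbox uses; the layer paths and the selection of statistics here are illustrative:

```python
# QGIS Python console: per-polygon zonal statistics (count and mean)
# of a raster band. Paths are hypothetical placeholders.
import processing

processing.run("native:zonalstatisticsfb", {
    "INPUT": "/path/to/project_polygons.gpkg",   # zone polygons
    "INPUT_RASTER": "/path/to/treecover.tif",    # e.g. a tree-cover raster
    "RASTER_BAND": 1,
    "COLUMN_PREFIX": "tc_",
    "STATISTICS": [0, 2],                        # 0 = count, 2 = mean
    "OUTPUT": "/path/to/polygons_with_stats.gpkg",
})
```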
> --> Our discussion clarifies this point too. Connecting these geoportals to QGIS is not existing right now. So it would be a result of my benchmark about QGIS performance. I thought about QGIS because it's free, quite complete. From this discussion, I remind Arcgis online and Mapbuilder to investigate.
I don't understand why you believe this ("Connecting these geoportals to QGIS is not existing right now"). For example, my other previous answer showed the existence of a QGIS plugin and examples of hundreds of samples for connecting to Google Earth Engine. QGIS ( and especially when paired with other OSGeo components ) can 'connect' to practically any web service, database connection, geoportal API ( like the USGS WFS https://apps.nationalmap.gov/services/ ), or file format import/ export. Not just QGIS, but practically any GIS these days has similar capabilities, but also languages like Python and R, and scripts like GDAL. Similarly, almost all authoritative data sources offer multiple means of accessing the data through web services, API calls, and FTP file downloads.
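A WFS connection, for example, is a couple of lines of PyQGIS; the endpoint and type name below are placeholders rather than a real service, so use the values the publisher documents:

```python
# QGIS Python console: load features from a WFS endpoint. The URL and
# typename are hypothetical placeholders.
from qgis.core import QgsVectorLayer, QgsProject

uri = ("https://example.org/geoserver/wfs?service=WFS&version=2.0.0"
       "&request=GetFeature&typename=ns:example_features&srsname=EPSG:4326")
layer = QgsVectorLayer(uri, "Example WFS features", "WFS")
if layer.isValid():
    QgsProject.instance().addMapLayer(layer)
```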
> About Google Earth Engine Editor, I am feeling able to train myself. I doubt it would be user friendly to other people not always educated to GIS basis? What do you think about it?
There is considerable difference in user skill levels between "designing a GIS process", "writing a GIS script from scratch", "editing an existing GIS script", "just running the GIS script", and "viewing the results of a GIS script" - and rarely are all of them required all of the time by everyone over the lifetime of the project.
For instance, I can't program in JavaScript, but can easily cut and paste and make minor edits in Google Earth Engine with a cheat sheet ( https://developers.google.com/earth-engine/tutorials/community/beginners-cookbook ).
Realistically, the universe itself isn't really 'user friendly', and being able to reliably perform certain tasks will require a certain expected level of skill.
If you need a UI for viewing and simple manipulation, tools like Leaflet ( https://leafletjs.com/plugins.html ) offer various sorts of interactivity - like a side-by-side comparison slider ( http://lab.digital-democracy.org/leaflet-side-by-side/ ) using two maps from your scripts in Google Earth Engine.
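If the team's language is Python rather than JavaScript, a rough analogue of that side-by-side view can be generated with folium's DualMap plugin (synchronized panes rather than a swipe slider); the two tile layers here are just stand-ins for whatever before/after layers your scripts export:

```python
# Generate a simple synchronized side-by-side HTML map with folium.
# The two basemaps below are placeholders for exported before/after layers.
import folium
from folium.plugins import DualMap

m = DualMap(location=(0.0, 15.0), zoom_start=5, tiles=None)
folium.TileLayer("OpenStreetMap").add_to(m.m1)      # left pane
folium.TileLayer("CartoDB positron").add_to(m.m2)   # right pane
m.save("compare.html")                              # open in any browser
```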
I have no idea what actual features or attributes you need, at what scale, or where on Earth, or any other characteristics of your operating situation, so it is impossible to be specific.
For me, the process is iterative, with a definite examination of the ROI before advancing to more capability and automation, because many factors might make any given end result of functionality simply not possible.
Very loosely:
1. Open the available data in QGIS by whatever connection means, usually as a WMS. Create a long narrow transect or multiple small areas I think are representative of the features, time, and space I am interested in. Manually use the QGIS processing tools with the history panel open to perform any operations needed to get to the end result ( see the sketch after this list ). Import the preliminary results into Google Earth Pro and use GE Pro's time slider for a rough eyeball validation.
2. Use the history of operations I conducted in (1) to build a generalized process in the QGIS graphical modeler ( https://docs.qgis.org/2.8/en/docs/user_manual/processing/modeler.html ).
3. If that works well, then maybe automate the input and output of the model.
4. Maybe one of the local processing steps takes hours on my local machine, so I replace those with a call to do the processing on Google Earth Engine.
5. Once that all runs easily, I take it outside of QGIS as a Python script in an Anaconda notebook, or maybe do it all in Earth Engine.
6. Write the appropriate Windows PowerShell or Linux Bash scripting for automated housekeeping and invoking the processing.
7. All along, make simple GUI-ish single-feature Leaflet displays of the intermediate results. Make a QGIS map atlas for PDF / print using the results. Or make a web map using https://docs.qgis.org/2.14/en/docs/user_manual/working_with_ogc/ogc_server_support.html , or upload the results to ArcGIS Online for display on a cell phone.
8. Repeat as required for each new functionality, or improve the performance of an existing function.
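As an example of the kind of single processing step in (1) that later gets chained into a model, here is a sketch generating random sample points inside the monitoring polygons; the paths, point count, and spacing are hypothetical:

```python
# QGIS Python console: random sample points inside polygons, one of
# the manual steps from (1) above. Paths and counts are placeholders.
import processing

processing.run("native:randompointsinpolygons", {
    "INPUT": "/path/to/project_polygons.gpkg",
    "POINTS_NUMBER": 25,    # points per polygon
    "MIN_DISTANCE": 30,     # minimum spacing, in layer CRS units
    "OUTPUT": "/path/to/sample_points.gpkg",
})
```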
The point being: take something simple, once, through a whole pipeline so you can figure out what works for you and the customer, and what fits your available time and resources. Maybe you decide to go from 1, then iterate from 2, and skip over to one of the alternatives in 7. Maybe 1, the first part of 2, then 5, all in Earth Engine.
But from experience, it is (1) which will probably be the major disappointment if your intent is using the global datasets for change detection. In a way, it's backwards: you should decide what changes your restorations cause, and the resolution and time scales those occur within, and then discover data for your specific region of interest within those constraints. There may very well be a specific platform with a sensor far more optimized for detecting whatever you are monitoring.
@Herve Parmentier @Michael John Patrick : Thank you for your replies
For the technical aspects, I'll study these comments, options and resources carefully.
Concerning the global databases, I am grateful to have the opportunity to study the extent to which they can be used to guide restoration in the field. Since Monday, I have analysed all the metadata behind these databases. My conclusions are similar: they are probably too coarse for monitoring according to my needs. They can, however, help in understanding the baseline (pre-restoration conditions) of the project.