Quick analysis of forest fragmentation using the new global deforestation map

In my last post I looked at some general patterns of deforestation in Central America as shown in the recent publication by Matthew Hansen and collaborators:

http://www.sciencemag.org/content/342/6160/850

The data are available for interactive browsing here:

http://earthenginepartners.appspot.com/science-2013-global-forest

The layers have been made public, and the authors anticipate widespread use of the information.
However, it will not be possible to download the raster layers directly until January.

However, any comparatively small area can be analysed simply by taking a screenshot from the browser and georegistering the bitmap. Check your screen resolution: the displayed Landsat pixels are 28.5 m by 28.5 m, so on a typical laptop screen you cannot capture an area much larger than about 38 km by 21 km.
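
As a rough back-of-envelope check (this assumes a 1366 x 768 laptop display; substitute your own resolution), the maximum capture size works out as follows.

res <- c(width = 1366, height = 768)  # screen resolution in pixels (an assumption)
pixelsize <- 28.5  # metres covered by each displayed Landsat pixel
res * pixelsize/1000  # maximum extent of one screenshot in km (roughly 39 by 22)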

Registering the screenshot to EPSG:3857 can be achieved using the Georeferencer GDAL plugin for QGIS. The simplest way to find control points is to take a screenshot of the Google base map and reference it against the same imagery shown by the OpenLayers plugin. The control points can then be saved and used to georeference screenshots of the overlying map layers, provided the screenshots have exactly the same extent.

(Screenshot: the QGIS georeferencer)

Take two screenshots from http://earthenginepartners.appspot.com/science-2013-global-forest. Use QGIS to register the underlying Google layer to the identical layer shown by the OpenLayers plugin. Find at least two (preferably many more) control points and save them to a file.

(Screenshot: the QGIS georeferencer with control points)

Now, load the layer showing deforestation. Load the control points and run the georeferencer. Make sure that EPSG:3857 is used at all times.

Quick fragmentation analysis

Packages needed.

library(rgdal)
library(rgeos)
library(raster)

My screenshot was very hurriedly georeferenced using only two points, so it does not line up perfectly. To load the screenshot, use the raster package to produce a "brick".
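
If you want to try this with your own screenshot, a minimal sketch is shown below; the file name defor_georef.tif is just a placeholder for whatever GeoTIFF the QGIS georeferencer wrote out.

library(raster)
r <- brick("defor_georef.tif")  # three-band (RGB) brick from the georeferenced screenshot
projection(r)  # should report EPSG:3857 (Web Mercator)
plotRGB(r)  # quick visual check that the registration looks sensible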

The code below loads the result directly from my Dropbox so that you can try playing with it in R. The layer is georeferenced in EPSG:3857, so distances are in meters.

load(url("http://dl.dropboxusercontent.com/u/2703650/Weblog/Defor/defor.rda"))

The raster brick is called r. The total area of the map, in hectares, can be calculated from its extent:

totalarea <- (r@extent@xmax - r@extent@xmin) * (r@extent@ymax - r@extent@ymin)/10000
totalarea


Now to look at the values in the green band.

plot(r[[2]])

(Plot of the values in the green band)

So, we can clump all values over 50 to find forest patches.

cl <- clump(r[[2]] > 50)
plot(cl)

(Plot of the forest clumps)

To produce a vector layer, use the dilation and erosion trick. First extract the coordinates of the raster cell centroids and drop the NA values.

id <- values(cl)
crds <- coordinates(cl)
crds <- crds[!is.na(id), ]
id <- id[!is.na(id)]
crds <- data.frame(id, crds)
coordinates(crds) <- ~x + y

Now buffer the points to join them into patches. Eroding the buffer by the same width then returns polygons of approximately the original patch size.

buf1 <- gBuffer(crds, width = 50, byid = T)
buf2 <- gUnaryUnion(buf1, id)
buf3 <- gBuffer(buf2, width = -50, byid = T)

We may want to remove fragments below 5 ha (50,000 square metres).

frags <- buf3[gArea(buf3, byid = T) > 50000]
plot(frags, col = "darkgreen")
axis(1)
axis(2)
box()
grid()

(Plot of the forest fragments over 5 ha)

Percent of area forested

forestarea <- gArea(frags)/10000
100 * forestarea/totalarea

This gives 16.48% of the area (remember only patches over 5 hectares are considered).

Now to calculate some patch metrics, including area, perimeter and a shape index (the ratio of each patch's perimeter to the perimeter of a circle of the same area).

area <- gArea(frags, byid = T)
edge <- gBoundary(frags, byid = T)
perims <- gLength(edge, byid = T)
d <- data.frame(id = names(frags), area, perims)
d$shape <- d$perims/(2 * pi * sqrt(d$area/pi))
d$area <- d$area/10000
head(d)

##    id     area perims  shape
## 1   1   22.806   4863  2.872
## 3   3 5082.178 560335 22.173
## 7   7   97.820  17788  5.074
## 9   9    6.386   2382  2.659
## 11 11   17.785   4751  3.178
## 12 12   75.351  14931  4.852

Notice how convoluted the large patches are, as indicated by the shape index.

hist(log10(d$area), col = "grey")

Finding core areas and edge effects

cores <- gBuffer(frags, width = -100, byid = TRUE)
plot(frags, col = "darkgreen")
plot(cores, col = "red", add = T)
axis(1)
axis(2)
box()
grid()

Percent of total area that is core

100 * gArea(cores)/10000/totalarea

So only around 4% of the total area (roughly a quarter of the forest) consists of core area.
Most of the forest lies within 100 m of an edge, which makes it very vulnerable to fire.
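
The same figure can also be expressed relative to the forest itself rather than the whole map:

# share of the forest (patches over 5 ha) lying more than 100 m from an edge
100 * gArea(cores)/gArea(frags)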

This can be calculated for each patch.

c_area <- gArea(cores, byid = T)/10000
c_area <- data.frame(id = names(c_area), c_area)
d <- merge(d, c_area, all = T)
d$corepercent <- (d$c_area/d$area) * 100
d$edgepercent <- 100 - d$corepercent
d <- d[order(d$area, decreasing = T), ]
head(d)

##       id   area perims shape c_area corepercent edgepercent
## 245    3 5082.2 560335 22.17 1547.5       30.45       69.55
## 170 2295 3012.4 209040 10.74 1586.7       52.67       47.33
## 316  691 1998.7 265413 16.75  409.5       20.49       79.51
## 318  707 1680.2 232236 15.98  366.0       21.78       78.22
## 121 1995  770.1 123244 12.53  154.5       20.06       79.94
## 4   1035  745.3 101270 10.46  155.4       20.85       79.15

If this were a real study area the results could be analysed in more detail.

Connectivity analysis may be a next step.
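
As a very rough first step (this is only a sketch, not a full connectivity analysis), rgeos can provide the pairwise edge-to-edge distances between the fragments, from which a nearest-neighbour distance for each patch follows:

dists <- gDistance(frags, byid = T)  # pairwise distances between fragments in metres (EPSG:3857)
diag(dists) <- NA  # ignore each patch's zero distance to itself
nearest <- apply(dists, 1, min, na.rm = T)  # distance from each patch to its nearest neighbour
summary(nearest)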

The vector layer can of course be exported as a shapefile for use in QGIS.

row.names(d) <- as.character(d$id)  # match the attribute rows to the polygon IDs
frags <- SpatialPolygonsDataFrame(frags, data = d, match.ID = TRUE)
writeOGR(frags, "shapefiles", "frags", driver = "ESRI Shapefile")


2 thoughts on “Quick analysis of forest fragmentation using the new global deforestation map”

  1. Hi Duncan,

    Interesting … this shows recent “forest loss” in the areas of agriculture that had no forest for > 20 years in the Gallon Jug area.

    • At the bottom of my previous post (https://duncanjg.wordpress.com/2013/11/19/new-high-resolution-global-forest-change-map-1) I made a comment on the Yucatan. Hansen et al.'s map shows extensive, albeit disturbed, “forest” in areas that we considered to be deforested when we worked on the classification for CE.
      This is to be expected; Hansen's group was aiming for consistency at a global scale, and it would have been hard to find an algorithmic approach that works everywhere. The cutoff point between scrubby secondary vegetation (or dry forest) and deforested areas is problematic.

      What would be very nice to see is an interactive app in which users could correct and improve the accuracy of the information based on their own knowledge.
