Importing Modis fire pixels into PostGIS

Modis data are “big data” in every respect: both the temporal and the spatial extent of the information obtained from Modis make processing it rather challenging.

A quick overview of the Modis fire maps can be obtained by running the following code in R, which downloads the ten-day global fire-map images and stamps each one with its date.

library(date)  # provides mdy.date() and date.ddmmmyy()

for (yr in 2005:2012) {
  startday <- as.numeric(mdy.date(1, 1, yr))
  for (i in seq(1, 370, by = 10)) {
    URLBase <- "http://rapidfire.sci.gsfc.nasa.gov/firemaps/"
    flnm <- paste("firemap.", yr, sprintf("%03.0f", i), "-", yr,
                  sprintf("%03.0f", i + 9), ".8192x4096.jpg", sep = "")
    URL <- paste(URLBase, flnm, sep = "")
    cat(URL, "\n")
    datlab <- date.ddmmmyy(startday + i)
    system(paste("wget", URL))
    # Use ImageMagick's convert to stamp the date label onto each image
    label <- paste("convert ", flnm,
                   " -background orange -pointsize 500 -gravity Center label:'",
                   datlab, "' -append ", flnm, sep = "")
    system(label)
  }
}

A complete archive of all pixels recorded as having possible fire activity (the quickfire data set) can be downloaded using:

wget -r ftp://fuoco.geog.umd.edu/modis/C5/mcd14ml/* --ftp-user=fire --ftp-password=burnt

The files are plain ASCII, with the format given in the manual written by Louis Giglio that is available from the same site.

The wget command will fill a folder with 562 files (about 3 GB) of suspected fires detected by Modis at a global scale, dating from November 2000 to August 2012.

The full set of data columns is described in Giglio's manual. For many uses this can be simplified to the essential columns: date (day), lat, lon, fire radiative power (FRP) and confidence.

So create a table to hold the key data.

CREATE TABLE modis_fire
(
  day date,
  lat double precision,
  lon double precision,
  frp double precision,
  conf double precision
)
WITH (
  OIDS=FALSE
);

Now to import it all through R. Assuming you are in the directory where the data are stored and the download has all been unzipped (gunzip *), start R and run the following.

library(RODBC)

con <- odbcConnect("modelling")  # change this to your own ODBC connection

f <- function(x) {
  d <- read.table(x, head = TRUE)
  day <- as.Date(as.character(d$YYYYMMDD), format = "%Y%m%d")
  dd <- data.frame(day, lat = d$lat, lon = d$lon, FRP = d$FRP, conf = d$conf)
  # Dump the reduced columns to a temporary CSV, then bulk-load it with COPY
  write.table(dd, "temp.csv", row.names = FALSE, sep = ",", col.names = FALSE)
  fl <- paste(getwd(), "/", "temp.csv", sep = "")
  a <- paste("COPY modis_fire FROM '", fl, "' DELIMITERS ',' CSV;", sep = "")
  odbcQuery(con, a)
}

Now the function will run through and load all the data into your table simply by lapply-ing it over a list of the files in the directory.

# A pattern avoids picking up temp.csv or other files; adjust it to match
# the names of the downloaded data files
aa <- dir(pattern = "MCD14ML")
lapply(aa, f)

However, this table is very large (> 2 GB). We can reduce it to the continent we are interested in (North and South America) and index it.

CREATE TABLE modis_fire2 AS
    SELECT * FROM modis_fire WHERE lon > -160 AND conf > 70;

SELECT AddGeometryColumn('modis_fire2', 'geom', 4326, 'POINT', 2);
update modis_fire2 set geom = ST_SetSRID(st_point(lon,lat),4326);
CREATE INDEX modfire_ind ON modis_fire2 USING GIST (geom);
ALTER TABLE modis_fire2 ADD COLUMN id serial NOT NULL PRIMARY KEY;
CREATE INDEX modfiredate_ind ON modis_fire2 (day);
drop table modis_fire;

Now we can look at all the fires in Chiapas since 2000.
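For instance, a bounding-box query along these lines pulls out the Chiapas detections. The coordinates here are only an approximate rectangle around the state (adjust as needed, or join against a proper boundary polygon); the && operator lets the query use the GIST index built above.

```sql
-- Approximate bounding box around Chiapas, Mexico (illustrative coordinates)
SELECT day, lat, lon, frp, conf
FROM modis_fire2
WHERE geom && ST_MakeEnvelope(-94.5, 14.5, -90.3, 17.9, 4326)
ORDER BY day;
```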

There is clearly a lot more work to be done on this data, but it is now in a format that is more easily interrogated.
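As a simple sketch of how the table can now be interrogated, a query like the following would count the high-confidence detections per year (column names as defined above):

```sql
-- Count detections per year (illustrative)
SELECT date_part('year', day) AS yr, count(*) AS n_fires
FROM modis_fire2
GROUP BY yr
ORDER BY yr;
```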
