Monday, 19 December 2016

Segmenting SpatialPolygons

Both SpatialPolygons and SpatialPolygonsDataFrame objects consist of a list of Polygons. Each of these Polygons in turn consists of vertices, i.e., the points where the edges of the polygon meet. When you transform a SpatialPolygons(DataFrame) to a different projection (e.g., from long-lat coordinates to UTM) with the spTransform function, only the vertices are actually transformed (and not the lines connecting them). This means that in the new projection, straight lines are also drawn between the vertices, whereas in reality they should be curved (due to the quasi-spherical shape of our world). For most shapes this isn't much of an issue. But when your shape covers a large area, you could run into serious problems.

This problem can be solved by adding more vertices to the polygons. In other words, add new vertices by interpolating between the existing vertices. If you do this before re-projecting, the new projection will be a lot more accurate. I tried to find an existing package in R that would do this, but I could not find any (please leave a comment if you did find an alternative). Therefore, I've written a small function to take care of this myself, which I will demonstrate in this post.

The source code of this function and the accompanying example is given below. This code was also used to produce the graphical illustration below. Note how the red line crosses the islands north-east of Scotland in the original projection. Compare this with the new projection of the red lines with and without segmentation. Also note that the red lines in the new projection are straight when we don't apply segmentation, and curved (as they are supposed to be) when we do.

The segmentation is achieved relatively simply by looping over all the lists of Polygons with the lapply function and then looping over each line segment with the apply function (see R script below). The new vertices are derived via linear interpolation using the approx function. You could go for something fancier like spline interpolation, but linear will do just fine in this example. As I've implemented all this in a function named 'SPSegmentation', you can simply call it on any SpatialPolygons or SpatialPolygonsDataFrame object you like. With some modifications, you could also use this function on SpatialLines and SpatialLinesDataFrame objects.
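As the original script is embedded elsewhere, here is a minimal sketch of the approach, assuming the sp package. The function name 'SPSegmentation' comes from the text above, but the body below is an illustrative reconstruction rather than the original code, and the argument n (the number of pieces each edge is split into) is my own choice:

```r
library(sp)

SPSegmentation <- function(obj, n = 10) {
  ## loop over all Polygons lists in the object
  obj@polygons <- lapply(obj@polygons, function(pols) {
    new.pols <- lapply(pols@Polygons, function(pol) {
      crds <- pol@coords
      ## interpolate both coordinate columns along the vertex index; this
      ## inserts (n - 1) evenly spaced vertices on every edge while keeping
      ## the original vertices intact
      N <- n * (nrow(crds) - 1) + 1
      Polygon(cbind(approx(crds[, 1], n = N)$y,
                    approx(crds[, 2], n = N)$y),
              hole = pol@hole)
    })
    Polygons(new.pols, ID = pols@ID)
  })
  obj
}

## usage: segment first, then re-project (UTM zone is an arbitrary example)
## shape.seg <- SPSegmentation(shape, n = 20)
## shape.utm <- spTransform(shape.seg, CRS("+proj=utm +zone=30 +datum=WGS84"))
```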

In case you want to achieve the opposite (i.e., reduce the number of vertices in a SpatialPolygons(DataFrame)), you can use the gSimplify function from the rgeos package. This function uses the Ramer–Douglas–Peucker algorithm to simplify the shape, which can save you some computing time, but your shape will also become less detailed as a result.
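For completeness, a small standalone example of the reverse operation; the circle and the tolerance value are made up for illustration:

```r
library(sp)
library(rgeos)

## build a circle approximated by 100 vertices
theta  <- seq(0, 2 * pi, length.out = 100)
circle <- SpatialPolygons(list(Polygons(list(
  Polygon(cbind(cos(theta), sin(theta)))), ID = "circle")))

## reduce the number of vertices with the Ramer-Douglas-Peucker algorithm;
## larger 'tol' values (in map units) remove more vertices
simple <- gSimplify(circle, tol = 0.05, topologyPreserve = TRUE)
nrow(simple@polygons[[1]]@Polygons[[1]]@coords)  # fewer vertices than 100
```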

Saturday, 26 November 2016

ProTrackR: chiptunes in R (part three)

It has been a while, but I've made some small progress on the ProTrackR package. Many bugs are fixed and some functionality has been added, making tracking in R more and more easy and fun. As if developing a package for handling ProTracker modules weren't the nerdiest thing ever, I found a way to top it off: what if I combine the ProTrackR package with a Shiny interface? Well, you get a basic online tracker.

For more information you can also read parts one and two on ProTrackR.

Friday, 25 November 2016

Fixing a function

Sometimes R functions don't do exactly what you want. In that case you could look up the source code of the function, paste it into your own code and make your modifications there. But sometimes it takes only a small tweak and you want to keep your code as clean as possible. For me this was the case for the Vuong test in the pscl package. This function only prints its output, but does not return it. This can easily be fixed by converting the function into text with the 'deparse' function. The 'print' statement can then be replaced by a 'return' statement using 'gsub'. Finally, the modified text can be turned into a function again by calling 'eval' and 'parse'. With this hack the new function returns the output I need. The code below shows everything described in this post. Go ahead and mutate functions just like that...
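Since the exact print statement inside pscl::vuong differs between package versions, here is the same trick demonstrated on a toy function; the pattern passed to gsub is the only part you would need to adapt:

```r
## a toy function that prints its result instead of returning it
f <- function(x) {
  result <- x^2
  print(result)
}

f.txt <- deparse(f)                                # function -> text
f.txt <- gsub("print(result)", "return(result)",  # swap print for return
              f.txt, fixed = TRUE)
f.new <- eval(parse(text = f.txt))                 # text -> function

y <- f.new(3)  # y now holds 9 instead of it just being printed
```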

Friday, 28 October 2016

Openstreetmap relation, please be a SpatialPolygons object

The sp package is the package when it comes to handling spatial data in R. Getting to know how this package works can take some time, but it's well worth it. Still, it can be a pain to import spatial data from, for instance, www.openstreetmap.org.

The other day, for instance, I needed an outline of Lake IJssel. How hard could it be to get that from openstreetmap? Well, getting the data is not all that hard, but converting it into something workable really got me pulling my hair out. In order to save future frustrations, I decided to share my solution with you.

The first step is to get the relation id number from openstreetmap. This can be easily obtained by searching the website for ‘Lake IJssel’. After clicking the first hit that represents a water body, you will find the relation id.

The second step is to download the data in R using this relation id. This can be done with the get_osm function from the osmar package. The data can then be converted into a SpatialLines object using the as_sp function from the same package. However, I don't want a SpatialLines object; I want a SpatialPolygons object. And that is the tricky bit, because there is no package available (that I know of) that can do this directly.

So this is the trick that I applied. First, join the lines in the SpatialLines object using the gLineMerge function from the rgeos package (it took me a while to figure this one out!). Then convert the result into a PolySet using the SpatialLines2PolySet function from the maptools package. Only then can it be converted into a SpatialPolygons object using the PolySet2SpatialPolygons function, also from maptools. If anyone knows of an easier way, please let me know in the comments.

So, there you have it: a SpatialPolygons object of Lake IJssel that is actually useful for my purpose. The code below does what I've described in this post; feel free to try it yourself! Note that it may not work on all openstreetmap data...
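Since the original script is embedded elsewhere, here is a hedged sketch of the workflow. The relation id below is a placeholder (look up the real id for Lake IJssel on www.openstreetmap.org):

```r
library(osmar)
library(rgeos)
library(maptools)

## download the relation, including all referenced ways and nodes
## (the id here is a placeholder, not the real one)
lake.osm <- get_osm(relation(1234567), source = osmsource_api(), full = TRUE)

## relation -> SpatialLines
lake.lines <- as_sp(lake.osm, "lines")

## join the individual line segments into closed rings
lake.merged <- gLineMerge(lake.lines)

## SpatialLines -> PolySet -> SpatialPolygons
lake.ps   <- SpatialLines2PolySet(lake.merged)
lake.poly <- PolySet2SpatialPolygons(lake.ps)

plot(lake.poly, col = "lightblue")
```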

Monday, 18 July 2016

2D particle tracking in R, part 2: kernel density

In my previous post, I showed how particle movement can be simulated in the marine environment. Now that we have tracked the particles, how can we translate this information into something more meaningful? One way is to determine the particle density on the map.

The simplest way would be to lay a grid over the particles and simply count the number of particles in each grid cell. However, I would like to go for something a little more sophisticated that gives smoother results: kernel density estimates. This means that I will treat each particle as a density function, where the position of the particle gives the highest density for that particle. You can get a density map of all particles by simply adding up the densities of each individual particle. Fortunately, the KernSmooth package will take care of all this.
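A minimal sketch of such an estimate with KernSmooth's bkde2D function; the particle positions here are randomly generated stand-ins, and the bandwidth and grid size are illustrative choices:

```r
library(KernSmooth)

## stand-in for the particle positions (x/y or lon/lat columns)
set.seed(1)
pos <- cbind(x = rnorm(5000), y = rnorm(5000))

## 2D kernel density estimate on a 200 x 200 grid
dens <- bkde2D(pos,
               bandwidth = c(0.25, 0.25),  # kernel bandwidth in map units
               gridsize  = c(200, 200))

## dens$fhat integrates to 1; multiplying by the number of particles gives
## particles per unit area
image(dens$x1, dens$x2, dens$fhat * nrow(pos))
```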

Remember that in the original simulation, I released 5,000 particles within a specific time-frame. With the kernel density estimate, I can estimate the number of particles per square kilometre at any time and location. This becomes even more interesting if the particles represent something in the real world, let's say oil. If we know how many cubic metres of oil are represented by each particle, the estimates can be transformed into the thickness of the oil slick. Keep in mind that this is just an initial test simulation that does not include all the processes required to properly simulate oil spills (e.g., oil evaporation, degradation and dispersion are not simulated).

Here's a video created with ffmpeg of the particle density movement over time.

And here's the fully commented script used to create the frames for the video.
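The script itself is embedded elsewhere, but the frame-rendering part boils down to something like the sketch below; the object dens.list (one density estimate per saved time step) and the file names are hypothetical:

```r
## write one png per saved time step
for (i in seq_along(dens.list)) {
  png(sprintf("frame_%04d.png", i), width = 800, height = 600)
  image(dens.list[[i]]$x1, dens.list[[i]]$x2, dens.list[[i]]$fhat)
  dev.off()
}

## then stitch the frames together on the command line:
## ffmpeg -framerate 25 -i frame_%04d.png -c:v libx264 density.mp4
```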

Friday, 15 July 2016

2D particle tracking in R

How hard is it nowadays to simulate particle movement in the sea and track the particles in R? Well, I discovered that it is not all that difficult. Within a couple of hours I had a simple simulation model running for the North Sea. All you need are time-dependent water currents and wind data, both of which can be obtained from marine.copernicus.eu.

The main steps that you will have to follow (a compact sketch is given after the list):

  • load the water currents and wind data
  • specify particle release details
  • specify simulation details
  • loop over all simulation time steps and for each step do the following with the particles that have been released:
    • for each particle determine the local water current and wind vector at the current time index
    • transform the particle coordinates to a projection in meters
    • move the particles with the water current (m/s) multiplied by the time step duration (s)
    • move the particles with the wind speed (m/s) multiplied by the time step duration (s) and a drift factor (between 0 and 1)
    • move the particles in random directions (to simulate diffusion)
    • transform the particle coordinates back to the original projection
    • save particle information every nth simulation step, where n is any positive integer value

Below you will find the script that does all this (each of the steps above is marked in the script). In this simulation I released 5,000 particles at a constant rate over 5 days and followed them for a couple of days. I found the trickiest bit was not the simulation itself, but reading the NetCDF files such that they could be used in a convenient manner. I created a video of this simulation with ffmpeg:
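For the NetCDF part, a hedged sketch with the ncdf4 package; the file name is a placeholder and the variable names ('uo' and 'vo' for eastward and northward velocity) depend on the Copernicus product you downloaded:

```r
library(ncdf4)

nc  <- nc_open("currents.nc")  # placeholder file name
lon <- ncvar_get(nc, "longitude")
lat <- ncvar_get(nc, "latitude")
tim <- ncvar_get(nc, "time")
u   <- ncvar_get(nc, "uo")     # eastward velocity (m/s), dims lon x lat x time
v   <- ncvar_get(nc, "vo")     # northward velocity (m/s)
nc_close(nc)
```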

So what's the practical use of all this? Well, obviously you can simulate the trajectory of any floating material, such as oil. However, you have to keep in mind that the simulation is only as good as the underpinning data and parameterisation. For instance, the water current data used here are not that accurate near estuaries. Of course, you can always do multiple runs of your simulation with different parameter settings (or even turn it into a Monte Carlo simulation). Also, here I only simulate in 2D, but there is nothing to stop you from modifying it to include the third dimension. What should also be added is some detection of whether particles have beached, but that should not be very difficult once you have coastline data.

Why in R? Well, commercial particle tracking software is generally expensive, whereas free software is often rigid and doesn't allow the user to include custom modifications. In R the simulation is free, and you can take the exact implementation into your own hands... Hopefully, this post will help you get started with your own simulations. Good luck!

Here's the script:

Edit: if you're interested, you might want to check out the follow-up post on particle density estimates from this simulation.