Monthly Archives: December 2013

installing the oce package for the R language

Several of the blog items have used the oce package.  The official version of this can be installed from within R by

install.packages("oce")

and more up-to-date versions can be installed using the devtools package written by Hadley Wickham, which is itself installed with

install.packages("devtools")

after which installing the latest development version of oce is accomplished with

library(devtools)
install_github('ocedata', 'dankelley', 'master')
install_github('oce', 'dankelley', 'develop')

Note that ocedata does not need to be updated frequently, as it is only updated when new datasets are added to oce. The official version of oce is only updated every few months, but the branch named develop (used above) may be updated several times a day, if the author is adding new features or fixing bugs.

For more on oce, see the oce website on github.

R interface to WebTide (tidal prediction)

The previous posting explained how to install WebTide on an OSX machine. This one shows how to hook up to an installed WebTide database, so that R code can get tidal predictions.

The following code in the R language will produce a graph in which the top panel mimics the tidal-elevation graph produced by WebTide itself (see previous blog posting for comparison).

library(oce)
tStart <- as.POSIXct("2013-12-29 14:21:00", tz="UTC")
tEnd <- as.POSIXct("2014-01-13 15:21:00", tz="UTC")
time <- seq(tStart, tEnd, by="15 min") # 15-minute intervals
prediction <- webtide("predict", lon=-65.06747, lat=45.36544, time=time)
A webtide prediction using the oce package of the R library

One advantage of accessing tidal predictions from within oce is that further analysis becomes easier. For example, a node nearer Halifax has a mixed tide, as the following illustrates in terms of velocity and a so-called progressive vector.

p <- webtide("predict", node=14569)
par(mfrow=c(2,1), mar=c(3, 3, 1, 1), mgp=c(2, 0.7, 0))
plot(p$u, p$v, asp=1, type="o")
dt <- diff(as.numeric(p$time[1:2]))
x <- dt * cumsum(p$u)
y <- dt * cumsum(p$v)
plot(x, y, asp=1, type="o")
Velocity covariance and progressive vector diagram for a mixed tide.

WebTide (tidal prediction) installation (OSX)

Abstract. This blog item explains how to install WebTide, a tidal-prediction application, on an OSX machine. The work requires moderate expertise: it is carried out in the console, and it requires that a C compiler be available on the host machine.

1. Introduction

WebTide comes with a friendly graphical user interface that makes it easy to get tidal predictions at various locations around the world. Anyone with an interest in oceanography is likely to find it a useful application. Although the interface is slightly quirky (particularly for map navigation), it only takes a few minutes to learn. (An upcoming blog entry will explain how to avoid the WebTide interface, using the oce R package to access the data that underlie WebTide.)

2. Installation

  1. Download the “linux without Java VM” version from the Bedford Institute of Oceanography website, and store the results in the ~/Downloads directory.

  2. Open a terminal or console window, and type the following to enter the WebTide directory and start the process of installing WebTide.

    cd ~/Downloads
    sh WebTide*.sh   # runs the downloaded installer; the exact filename varies by version

    After a moment, a dialog box will pop up, asking where WebTide is to be installed. To install a version just for you, supply /Users/USERNAME/WebTide, where USERNAME is your username. To install a version for all accounts on the computer, use the default, which is /usr/local/WebTide. A second dialog box addresses the issue of symlinks. Click the box indicating that these are not to be used. After a few moments, a final dialog box will appear, stating that the work has been completed. Click the button named “Finish”.

  3. Next comes the OSX-specific part of the work. You will need the gcc compiler to do this. It is available for free on the Apple website, as the Xcode development tools. (If you have been using the computer for science, it is quite likely you have already installed a compiler, so just try the steps below, before bothering with a re-install of Xcode.) In a console, type the following to enter the source directory (here assumed to be configured for a single user) and to compile two programs.

    cd ~/WebTide/Tidecor_src
    gcc -O2 -o tidecor webtidecor2.5.1.c -lm
    gcc -O2 -o constituentinterpolator constituentinterpolator1.1.c -lm

    In the two gcc lines shown above, it may be necessary to change the names of the ".c" files to match whatever you find in this Tidecor_src directory. You may ignore any warning messages that appear, but if errors appear, you will need to interrupt this tutorial to find out what is wrong. Possibly leaving a comment on this site would encourage another reader to offer a solution; possibly the scientists at the Bedford Institute of Oceanography (named on the website) could help.

  4. Finally, move these newly-compiled programs from the source directory into the bin directory, by typing

    mv tidecor ../bin
    mv constituentinterpolator ../bin

3. Using WebTide

If you installed WebTide to your home directory, launch WebTide by typing the following in a console window (the launcher name may differ between WebTide versions)

~/WebTide/WebTide

and use the GUI from that point on. If you installed it to /usr/local/WebTide, type the following instead

/usr/local/WebTide/WebTide
The main WebTide window looks like the following.

A WebTide map, showing a marker in the Bay of Fundy (see also the sea-level elevation diagram)

There is little point in explaining the details of the menu and mouse actions here, since WebTide provides its own documentation. In most cases, the action will be to click the mouse on a spot on the map, setting a so-called “marker” at the location where calculations are to be done. The diagram shown here has a marker in the Bay of Fundy. Once a marker has been set, the user may wish to click a menu item (at the right-hand side) to get a tidal prediction, as shown in the screenshot below.

A WebTide prediction of sea-level elevation


Sundial with gnomon

After experimenting with calculations for what I eventually came to realize were analemma-based sundials (with the shadow cast by a vertical pole), I remembered that the common sundial has a wedge as the shadow-maker. A bit of research told me that the wedge is called a gnomon. It is a right triangle with one vertex (the “centre” vertex, shall we say) having an angle equal to the local latitude. If this wedge is placed upright on a horizontal plane with the centre vertex aligned south and the 90° vertex aligned north, then the shadow will produce a line that indicates the hour of the day. This works throughout the year, with approximately quarter-hour adjustments being required through the seasons.
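A minimal sketch of the underlying shadow geometry may help; it uses an assumed sun position rather than one computed with oce's sunAngle(). A point at height z on the gnomon casts a shadow of horizontal length z/tan(altitude), in the direction opposite the sun's azimuth.

```r
## Shadow of a point at height z, for an assumed sun position.
altitude <- 30                  # degrees above the horizon (assumed)
azimuth <- 135                  # degrees clockwise from north (assumed)
z <- 1                          # height of a point on the gnomon
rad <- pi / 180
R <- z / tan(altitude * rad)    # shadow length
shadowAz <- (azimuth + 180) %% 360 # shadow points away from the sun
x <- R * sin(shadowAz * rad)    # east component of shadow tip
y <- R * cos(shadowAz * rad)    # north component of shadow tip
print(c(R=R, x=x, y=y))
```

The full code below applies the same trigonometry to a line of points along the gnomon's hypotenuse, hour by hour.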

The R code given below the diagram creates an outline of the expected edge of the shadow of the gnomon. To illustrate the variation in angle through the year (which relates to the “equation of time”), colours are used to indicate four significant times during the year.

Printed at full scale, the diagram might be suitable for laying out the horizontal scale for a sundial. Naturally, readers must alter the latitude and longitude if they do not live in Halifax, Nova Scotia.

A few notes:

  1. The gnomon hypotenuse will point to the pole star (Polaris) when the apparatus is aligned properly towards the north.
  2. Calling the function with debug=1 will show dots along the radial lines. These are the shadows of virtual points lying along the hypotenuse of the gnomon, and provide a check against errors in the calculation (since they should lie along a line if the gnomon angle matches the latitude).
  3. Noon is not aligned with North because the longitude is not an even multiple of 15 degrees.
  4. The length of the shadow provides extra information, but that information is not shown here (the lengths are normalized where x and y are divided by len in the code).
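Point 3 can be checked with a few lines of arithmetic, using the same timezone formula as the sundial code; the values below are for Halifax.

```r
## Each degree of longitude corresponds to 4 minutes of solar time, so
## a longitude that is not a multiple of 15 degrees puts solar noon
## away from clock noon (and hence noon away from north on the dial).
lon <- -63.60                       # Halifax
timezone <- floor(0.5 + lon / 15)   # nearest whole-hour zone, as in the code
offsetMinutes <- 4 * (lon - 15 * timezone)
print(timezone)       # -4, i.e. UTC-4
print(offsetMinutes)  # about -14.4: the sun crosses the meridian ~14 min after clock noon
```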


## gnomon-style sundial
library(oce)                         # provides sunAngle()
if (!interactive())
    png("sundial_with_gnomon.png", width=7, height=6, unit="in", 
        res=200, pointsize=13)

sundial <- function(lon=-63.60, lat=44.65,
                    days=c("2014-03-20", "2014-06-20", "2014-09-23", "2014-12-21"),
                    keys=c("Spring equinox", "Summer solstice",
                           "Autumn equinox", "Winter solstice"),
                    debug=0)
{
    col <- 1:4
    glwd <- 8
    timezone <- floor(0.5 + lon / 15)
    L <- 1                           # horiz gnomon length (arbitrary)
    nhours <- 24
    first <- TRUE
    for (season in 1:4) {
        hours <- seq.POSIXt(as.POSIXct(days[season], tz="UTC"),
                            by="1 hour", length.out=nhours)
        for (hour in seq_along(hours)) {
            t <- hours[hour]
            tlocal <- t + 3600 * timezone
            sa <- sunAngle(t, lon=lon, lat=lat)
            gy <- seq(0, L, length.out=10)
            gx <- rep(0, length(gy))
            gz <- gy * tan(lat * pi / 180)
            R <- gz / tan(-sa$altitude * pi / 180) # radius of shadow
            theta <- (90 - sa$azimuth) * pi / 180
            par(mar=rep(2, 4))
            x <- gx + R * cos(theta)
            y <- gy + R * sin(theta)
            len <- max(sqrt(x^2 + y^2))
            x <- x / len * L
            y <- y / len * L
            if (sa$altitude > 0) {
                if (first) {
                    first <- FALSE
                    plot(x, y, type='n', pch=20, asp=1,
                         xlim=c(-L, L), ylim=c(-L/5, L),
                         axes=FALSE, col=col[season], xlab="", ylab="")
                    ## Draw gnomon as a gray bar
                    segments(0, 0, 0, L, lwd=glwd, col='gray')
                    abline(h=0, lwd=1.5*par('lwd'), lty='dotted')
                    abline(v=0, lwd=1.5*par('lwd'), lty='dotted')
                    mtext("South", side=1, at=0)
                    mtext("West", side=2, at=0)
                    mtext("North", side=3, at=0)
                    mtext("East", side=4, at=0)
                    legend("topright", lwd=glwd, col="gray",
                           legend=sprintf("%.3fE %.3fN", lon, lat))
                    legend("topleft", lwd=1, col=col, cex=3/4,
                           legend=keys)
                    points(0, 0, pch=20, cex=2)
                    segments(0, 0, x, y, col=col[season])
                } else {
                    segments(0, 0, x, y, col=col[season])
                    if (debug)
                        points(x, y)
                }
                if (season == 2) {   # label hours along the solstice lines
                    xend <- tail(x, 1)
                    yend <- tail(y, 1)
                    text(1.05*xend, 1.05*yend, format(tlocal, "%H"))
                }
            }
        }
    }
}

sundial()

if (!interactive())
    dev.off()


Solar analemma

Continuing on the theme of solar angles, the code given below produces an analemma diagram similar to that of Lynch (2012, figure 2).

library(oce)                           # provides sunAngle()
loc <- list(lon=-0.0015, lat=51.4778)  # Greenwich Observatory
t <- seq.POSIXt(as.POSIXct("2014-01-01 12:00:00", tz="UTC"),
                as.POSIXct("2015-01-01 12:00:00", tz="UTC"),
                by="1 d")
sa <- sunAngle(t, lon=loc$lon, lat=loc$lat)
par(mar=c(3, 3, 1, 1), mgp=c(2, 0.7, 0)) # tighten margins
plot(sa$azimuth, sa$altitude, type='p', pch=20,
     xlab="Azimuth", ylab="Altitude")



  1. Peter Lynch, 2012. The equation of time and the analemma. Irish Math. Soc. Bulletin, 69:47-56.

coastline in oce package (part II)

1. Introduction. As another posting illustrated, the existing coastline in oce is noticeably coarse on scales commonly used in coastal-ocean work. A better alternative is the 1:10m coastline provided at the Natural Earth website. That site also provides 1:50m and 1:110m versions, and this entry illustrates those as well, with an aim of deciding whether to include them in oce.

2. Methods. The files are “Existing” (what is in oce as of early December 2013), “Fine” (the 1:10m file), “Medium” (the 1:50m file) and “Coarse” (the 1:110m file). The latter three were downloaded in December 2013 and read into oce with names coastlineWorldFine etc., after which timing and graphical tests were done.

3. Results.

3.1 Timing tests. A world map was plotted to a PNG file (R file world.R, appendix 1), and it yielded the following times: Existing 6.3s, Fine 16.9s, Medium 0.63s, Coarse 0.14s. (For context, note that all four panels plot in under half a second with the x11 device commonly used for interactive work.)
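The timing recipe itself is not shown above; a generic sketch is to wrap the plot call in base R's system.time() while a PNG device is open (the filename and toy data below are placeholders, not the coastline files).

```r
## Time how long a plot takes when rendered to a PNG file, as in the
## timing tests described above (synthetic placeholder data).
f <- tempfile(fileext=".png")
png(f, width=7, height=7, units="in", res=100)
elapsed <- system.time(
    plot(sin(seq(0, 2*pi, length.out=1e5)), type="l")
)[["elapsed"]]
dev.off()
cat(sprintf("plot took %.2f s\n", elapsed))
```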

3.2 File sizes. The corresponding sizes of the .rda files are: Existing 1.8Mb, Fine 3.0Mb, Medium 0.5Mb, and Coarse 0.1Mb.

3.3 Plot quality. The image below is produced by the R code given in Appendix 1.


At top-left is the existing coastline, which on this scale is very close in appearance to the Fine case to its right and the Medium case below it. At bottom-right is the Coarse case, which reproduces the overall features but misses many islands (see e.g. near Alaska and the Caribbean). The timing results quoted above argue against using Fine for interactive work on even continental scales; Medium seems the best choice at global scales, given its order-of-magnitude reduction in plotting time. (As a side note, speed improvements will be reflected in file sizes if PDF output is chosen.) As for the shelf scale, see the previous posting, which suggests using Fine there.

4. Discussion and conclusion. These results argue for replacing the existing coastline with the three variants from Natural Earth. This is also a good time to prune other datasets. Separate tests (not shown here) suggest that Fine is preferable to the present dataset named coastlineMaritimes, so the latter can be dropped. Oce also contains a dataset named coastlineSLE that is mainly of interest to the author (and, like any localized dataset, can always be loaded for detailed work).

5. Plan for oce. Subject to feedback from other oce users, all three Natural Earth datasets will be incorporated into Oce. The existing coastline (named coastlineWorld) will be replaced with Medium. (Altering the name would break existing code.) The other datasets will be named coastlineFine and coastlineCoarse. The existing datasets for the Maritimes and SLE will be dropped. The changes will increase plot quality and speed, at the cost of a 1.7Mb increase in the size of the data package. The work will be done in a new branch called “coastline” that will hang off the branch called “develop”, and will be merged into the latter pending user approval and testing.

APPENDIX 1: The R code (world.R) used to make the image is given below. Readers should note that this will not work for them unless they first download the datasets, read the datasets with read.oce(), and save() the results in .rda files as named in the code.

library(oce)
if (!interactive())
    png("world.png", width=7, height=7, unit="in", res=150) # output name arbitrary
par(mfrow=c(2,2), mar=rep(0, 4))
data(coastlineWorld)              # "Existing"
plot(coastlineWorld,
     longitudelim=c(-80,10), latitudelim=c(0,120),
     projection="orthographic", axes=FALSE,
     orientation=c(45,-100,0), fill='grey')
load("coastlineWorldFine.rda")    # "Fine"; see the note above about the .rda files
plot(coastlineWorldFine,
     longitudelim=c(-80,10), latitudelim=c(0,120),
     projection="orthographic", axes=FALSE,
     orientation=c(45,-100,0), fill='grey')
load("coastlineWorldMedium.rda")  # "Medium"
plot(coastlineWorldMedium,
     longitudelim=c(-80,10), latitudelim=c(0,120),
     projection="orthographic", axes=FALSE,
     orientation=c(45,-100,0), fill='grey')
load("coastlineWorldCoarse.rda")  # "Coarse"
plot(coastlineWorldCoarse,
     longitudelim=c(-80,10), latitudelim=c(0,120),
     projection="orthographic", axes=FALSE,
     orientation=c(45,-100,0), fill='grey')
if (!interactive())
    dev.off()


coastline in oce package (part I)

The oce package has a world coastline file that is visibly crude on scales suitable for continental-shelf work. At the cost of about 1 Mbyte of storage, a candidate replacement is a 1:10 million scale version downloaded from the Natural Earth website (full link below).

As illustrated below with plots near two oceanographic centres, this candidate provides noticeably more detail.



The code to make the graph is below. Readers wishing to reproduce this will need to download the coastline file named in the read.oce() call, from the naturalearthdata website.

N.B. The download is an 8.4 Mbyte data file that becomes a 3.0 Mbyte file in R format, which is to be compared with the 1.8 Mbyte file for the existing coastline. The existing coastline has 406694 points, while the 1:10m version has 552670 points (i.e. 30% more); the mismatch between the data-size ratio and the file-size ratio perhaps has something to do with compression.
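The compression point can be illustrated without the coastline files: .rda files are compressed, so file size depends on how compressible the data are, not just on how many points there are. A toy comparison (the file names are temporary):

```r
## Two equally long vectors saved to .rda: the constant one compresses
## far better than the random one, so file sizes need not scale with
## the number of points.
n <- 1e5
constant <- rep(1.0, n)
noisy <- runif(n)
fConstant <- tempfile(fileext=".rda")
fNoisy <- tempfile(fileext=".rda")
save(constant, file=fConstant)
save(noisy, file=fNoisy)
cat("constant:", file.size(fConstant), "bytes;",
    "noisy:", file.size(fNoisy), "bytes\n")
```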

library(oce)
data(coastlineWorld)   # the existing coastline, for comparison
par(mfrow=c(1, 2))     # two panels, one per oceanographic centre
d <- read.oce("ne_10m_admin_0_countries.shp")

## Two panels illustrate two oceanographic centres.

## Left panel: Halifax NS, with grey for existing
## world coastline and blue for the 1:10m version
## under consideration.
plot(coastlineWorld, clon=-63.60, clat=44.65, span=200)
lines(d[['longitude']], d[['latitude']], col='blue', lwd=1.4)

## As left panel, but for Woods Hole MA.
plot(coastlineWorld, clon=-70.70, clat=41.65, span=200)
lines(d[['longitude']], d[['latitude']], col='blue', lwd=1.4)