# Visualizing weather data in Google Earth

Note: This article describes the assessment work for the course Scientific Visualization taught at the Grenoble Institute of Technology (winter semester 2015/2016).

### Introduction

The goal of this project was to visualize data from Meteo France in Google Earth using several methods presented in the course. Meteo France provides publicly available weather data, such as temperature or atmospheric pressure, at several locations. The work consists of the following four main steps:

• read and import the scattered data (longitude, latitude, temperature, pressure, and wind velocity)
• interpolate the scattered data in order to evaluate it at arbitrary positions, using Shepard’s method and Hardy multiquadrics
• calculate colormap images and isocontours from the interpolated data
• export the colormap images and the isocontours in a KML (Keyhole Markup Language) file in order to visualize the data in Google Earth

The following sections present the results of this task together with a detailed description of the individual implementation steps that lead to the final visualization.

### Solution

I have carried out this project using the C++ programming language. First, I am attaching several test visualizations that have been exported to Google Earth in order to verify the correctness of the code output. You can find the result in Figure 1.

Fig. 1: Colormaps and isocontours of temperature (left) and wind velocity (right) visualized in Google Earth.

The data processing pipeline, which goes from the initial data reading to the final visualization, is hardcoded in the function run_visualization shown below:
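A minimal sketch of what such a pipeline can look like; the class name WeatherData, the method names, and the signatures here are assumptions for illustration, not the exact interface from the source code. The stub bodies only record the call order, to make the fixed ordering of the pipeline explicit:

```cpp
#include <string>
#include <vector>

// Hypothetical sketch of the project's processing class; the real method
// names and signatures may differ.  The stub bodies record which steps ran.
class WeatherData {
public:
    std::vector<std::string> log;  // call order, for illustration only
    void read(const std::string&)        { log.push_back("read"); }
    void create_grid(int, int)           { log.push_back("grid"); }
    void interpolate_shepard(double)     { log.push_back("interpolate"); }
    void interpolate_hardy(double)       { log.push_back("interpolate"); }
    void compute_colormap(int)           { log.push_back("colormap"); }
    void compute_isocontours(int)        { log.push_back("isocontours"); }
    void export_kml(const std::string&)  { log.push_back("export"); }
};

// Fixed order: read -> grid -> interpolate -> colormap -> isocontours -> export.
void run_visualization(WeatherData& data, const std::string& input)
{
    data.read(input);
    data.create_grid(1025, 1025);     // 1025 nodes in each direction
    data.interpolate_shepard(2.0);    // or: data.interpolate_hardy(1.0);
    data.compute_colormap(64);        // jet color table with 64 colors
    data.compute_isocontours(6);      // 6 isolines
    data.export_kml("output.kml");
}
```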

By default, the size of the grid is set to 1025 nodes in both directions, which leads to a generated colormap of size 1024 × 1024. Then Shepard’s interpolation with the power parameter p = 2 is performed. For the colormap, the predefined color table jet with 64 colors is used. After that, 6 isolines are computed, and both the colormap and the isolines are exported to the KML file. One can change the arguments or uncomment one of the interpolation methods, but the order of the data processing pipeline has to be kept as is.

For the code compilation, one can use the attached bash script compile.sh, which executes the prepared Makefile. Due to the usage of features introduced in C++14, the minimum required version of the GCC compiler is 4.8. The compiled application takes the filename of the input data as an argument. Both the application and the input file have to be located in the same directory.

### Implementation details

Tip: The source code of this work can be found in this GitHub repository.

The main idea behind the implementation was to create a single class for the data processing, which includes data structures for scattered and uniform data as well as all methods, beginning with reading the input data and ending with the required visualization. The listings of this class appear below; the individual methods and data structures are described later.

The main advantage of this approach is a simpler and faster way to access the data during the interpolation of the scattered data onto the uniform grid. The drawback is, obviously, the inability to reuse the code for other similar problems, as well as the more difficult development of code extensions.

#### Reading input data

The first step is to gather and parse the data. Meteo France provides publicly available data, including atmospheric parameters measured at several French meteorological stations, in the CSV (Comma-Separated Values) file format. For the purposes of the project, we are interested in temperature, atmospheric pressure, and wind velocity. Since all the data undergo the same process before the final visualization, the data are loaded from separate CSV files, each of which has to contain three columns: the first two are dedicated to the latitude and longitude of the site, and the last one to the value of the measured parameter.

Scattered data are parsed into the following data structure:
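A sketch of this container; the exact field names are assumptions for illustration:

```cpp
#include <array>
#include <cstddef>
#include <vector>

// Assumed bounding-box helper: minimum and maximum coordinates of the data.
struct Boundary {
    double lat_min = 0, lat_max = 0;
    double lon_min = 0, lon_max = 0;
};

// Sketch of the scattered-data container (field names are assumptions).
struct ScatteredData {
    std::size_t size = 0;                        // number of stations (rows)
    Boundary boundary{};                         // min/max coordinates
    std::array<std::vector<double>, 3> values;   // [0] latitude, [1] longitude, [2] value
};
```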

The size value stands for the number of rows in the input file (the number of stations), boundary stores the minimum and maximum coordinates, and the array of three vectors, values, holds the data itself.

The following method is responsible for parsing the data:
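A simplified stand-in for this method is sketched below. To keep the sketch testable without a file, it parses the CSV rows from a stream; the function and field names are assumptions:

```cpp
#include <algorithm>
#include <array>
#include <cstddef>
#include <istream>
#include <sstream>
#include <string>
#include <vector>

// Minimal scattered-data container (field names are assumptions).
struct ScatteredData {
    std::size_t size = 0;
    double lat_min = 0, lat_max = 0, lon_min = 0, lon_max = 0;
    std::array<std::vector<double>, 3> values;  // latitude, longitude, value
    std::string name;                           // name of the input file
};

// Parses "lat,lon,value" rows and tracks the bounding box.  A sketch only:
// the real method opens the given file and may handle a header row.
ScatteredData read_scattered(std::istream& in, const std::string& name)
{
    ScatteredData data;
    data.name = name;
    std::string line;
    while (std::getline(in, line)) {
        if (line.empty()) continue;
        std::replace(line.begin(), line.end(), ',', ' ');
        std::istringstream row(line);
        double lat, lon, value;
        if (!(row >> lat >> lon >> value)) continue;  // skip malformed rows
        data.values[0].push_back(lat);
        data.values[1].push_back(lon);
        data.values[2].push_back(value);
    }
    data.size = data.values[0].size();
    if (data.size > 0) {
        auto [lat_lo, lat_hi] = std::minmax_element(data.values[0].begin(), data.values[0].end());
        auto [lon_lo, lon_hi] = std::minmax_element(data.values[1].begin(), data.values[1].end());
        data.lat_min = *lat_lo; data.lat_max = *lat_hi;
        data.lon_min = *lon_lo; data.lon_max = *lon_hi;
    }
    return data;
}
```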

This method reads and imports the data from a given file, finds the boundaries and stores the name of the file.

#### Creating uniform grid

Once the input data are loaded, we need to create a two-dimensional uniform grid to be able to perform the interpolation. Similarly as before, we need a data structure for the uniform data:
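A sketch of this container, with assumed field names and with simplified stand-ins for the node and cell classes:

```cpp
#include <utility>
#include <vector>

// Simplified stand-ins for the node and cell classes.
struct Node { int id = 0; double lat = 0, lon = 0, value = 0; };
struct Cell { int id = 0; };

// Sketch of the uniform-grid container (field names are assumptions).
struct UniformData {
    std::pair<int, int> size;              // number of nodes in x and y
    std::pair<double, double> spacing;     // spacing between two nodes in x and y
    double x_min = 0, x_max = 0, y_min = 0, y_max = 0;  // boundaries
    std::vector<std::vector<Cell>> cells;  // (size.first - 1) x (size.second - 1)
    std::vector<std::vector<Node>> nodes;  // size.first x size.second
};
```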

This data structure contains the size of the grid, which is now a pair of integers (the number of nodes in the x and y directions), the spacing between two nodes in both directions, the minimum and maximum coordinates, the matrix of cells, and the matrix of nodes.

The matrices of cells and nodes use special data structures. The first data structure is the cell:

This class contains an ID, four pointers to the nodes assigned to the cell, the average of the node values, and an RGBA color, which takes a value determined by the average (in order to calculate the colormap). In addition, this class includes the method for computing the average value of the nodes assigned to the cell, as well as several methods for selecting a segment of this cell in order to compute the isocontours. The segment data structure contains only the coordinates of the starting and ending points:
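A sketch of both data structures; the names and member layout are assumptions, and the segment-selection methods are omitted here:

```cpp
#include <array>

// Simplified node stand-in.
struct Node { int id = 0; double lat = 0, lon = 0, value = 0; };

// A line segment of an isocontour: starting and ending point coordinates.
struct Segment {
    double x1, y1;  // starting point
    double x2, y2;  // ending point
};

// Sketch of the cell class (names and layout are assumptions).
struct Cell {
    int id = 0;
    std::array<const Node*, 4> nodes{};          // the four corner nodes
    double average = 0;                          // average of the node values
    unsigned char r = 0, g = 0, b = 0, a = 255;  // RGBA colormap color

    // Average of the four node values, used to pick the cell color.
    void compute_average() {
        double sum = 0;
        for (const Node* n : nodes) sum += n->value;
        average = sum / 4.0;
    }
};
```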

The second data type is used to describe the nodes. This class is much simpler than the cell class: it contains only an ID and three floating point numbers, namely the latitude, the longitude, and the value of the measured parameter:
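A sketch of this class, with assumed field names:

```cpp
// Sketch of the node class (field names are assumptions).
struct Node {
    int id = 0;
    double latitude = 0;
    double longitude = 0;
    double value = 0;  // interpolated value of the measured parameter
};
```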

The second step in the data processing pipeline is to create a two-dimensional uniform grid. For this purpose, there is a method in the main class:
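A simplified free-function stand-in for this step is sketched below. Unlike the real method, it takes the boundaries of the scattered data explicitly, and all names are assumptions; cell initialization is omitted:

```cpp
#include <vector>

struct Node { int id = 0; double lat = 0, lon = 0, value = 0; };

// Sketch of grid construction: given the bounding box of the scattered data
// and the number of nodes in each direction (e.g. 1025 x 1025 by default),
// compute the spacing and place the nodes.
struct Grid {
    int nx = 0, ny = 0;       // number of nodes in each direction
    double dx = 0, dy = 0;    // spacing between neighboring nodes
    std::vector<std::vector<Node>> nodes;
};

Grid create_grid(int nx, int ny,
                 double lat_min, double lat_max,
                 double lon_min, double lon_max)
{
    Grid g;
    g.nx = nx;
    g.ny = ny;
    g.dx = (lon_max - lon_min) / (nx - 1);  // longitude spacing
    g.dy = (lat_max - lat_min) / (ny - 1);  // latitude spacing
    g.nodes.assign(nx, std::vector<Node>(ny));
    int id = 0;
    for (int i = 0; i < nx; ++i)
        for (int j = 0; j < ny; ++j)
            g.nodes[i][j] = {id++, lat_min + j * g.dy, lon_min + i * g.dx, 0.0};
    return g;
}
```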

This method takes the size of the grid in both directions as an argument. The grid size significantly affects the quality and the computational time of the interpolation, as well as the resolution of the computed colormap and the smoothness of the isocontours. The grid is located automatically in accordance with the boundaries of the scattered data. This method also initializes the cells and nodes.

#### Performing interpolation

The crucial step is the interpolation of the scattered data in order to evaluate the measured parameters at the grid points. In accordance with the project specifications, the following two methods have been used:

##### Shepard’s method

The principle of finding an interpolated value of scattered data $f_{i} = f\left( \vec{x}_{i} \right)$ for $i \in \left\lbrace 1, \ldots, N \right\rbrace$ at a given point $\vec{x}$ using Shepard’s method is based on constructing an interpolating function

$\small{F \left( \vec{x} \right) = \sum_{i = 1}^{N} \omega_{i} \left( \vec{x} \right) f_{i}},$

where the weighting function $\omega_{i} \left( \vec{x} \right)$ is defined by Shepard as follows:

$\small{\omega_{i} \left( \vec{x} \right) = \frac{\mathrm{d} \left( \vec{x}, \: \vec{x}_{i}\right) ^{-p}}{\sum \limits_{j = 1}^{N} \mathrm{d} \left( \vec{x}, \: \vec{x}_{j}\right) ^{-p}}}, \qquad \small{\mathrm{d}\left( \vec{x}, \: \vec{x}_{i} \right) = \lVert \vec{x} - \vec{x}_{i} \rVert},$

and $p$ is a positive real number, called the power parameter.

The implementation of this method is attached below. In the first step, the weighting function $\omega_{i} \left( \vec{x} \right)$ is calculated for each grid point. Afterwards, the measured parameter is evaluated there.
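A self-contained sketch of the evaluation at a single point; the real method loops over all grid nodes, and the function and variable names here are assumptions:

```cpp
#include <cmath>
#include <cstddef>
#include <vector>

// Shepard interpolation at point (x, y) from stations (xs, ys) with values
// fs, using power parameter p.  Weights are w_i = d(x, x_i)^(-p).
double shepard(double x, double y,
               const std::vector<double>& xs,
               const std::vector<double>& ys,
               const std::vector<double>& fs,
               double p)
{
    double num = 0.0, den = 0.0;
    for (std::size_t i = 0; i < xs.size(); ++i) {
        double d = std::hypot(x - xs[i], y - ys[i]);
        if (d == 0.0) return fs[i];        // point coincides with a station
        double w = 1.0 / std::pow(d, p);   // inverse-distance weight
        num += w * fs[i];
        den += w;
    }
    return num / den;  // F(x) = sum_i w_i f_i / sum_j w_j
}
```

Note that the weights interpolate exactly at the data points (the weight diverges there), which the coincident-point branch handles explicitly.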

##### Hardy multiquadrics

The second method which has been implemented is called Hardy multiquadrics. Now, we are looking for the interpolating function in the following form: $$\small{F\left( \vec{x} \right) = \sum_{i = 1}^{N} \alpha_{i} h_{i} \left( \vec{x} \right)},$$ where $\alpha_{i}$ are unknown coefficients and $$\small{h_{i} \left( \vec{x} \right) = \sqrt{R + \mathrm{d} \left( \vec{x}, \: \vec{x}_{i} \right)^{2} }}.$$ The input argument $R$ is a nonzero shape parameter. The coefficients $\alpha_{i}$ are calculated by solving the system of equations given by the conditions $$\small{F\left( \vec{x}_{i} \right) = f_{i}}.$$ It was proven by Micchelli that for distinct data points, this system of equations is always solvable.

The implementation of the Hardy’s method has been carried out as follows:

The system of linear equations is solved using LU decomposition from the Eigen C++ library. Then the function $h_{i} \left( \vec{x} \right)$ is calculated at each grid point and the measured parameter is evaluated there.
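A self-contained sketch of the method follows. So that the sketch compiles without external dependencies, a small Gaussian elimination with partial pivoting stands in for Eigen’s LU decomposition; all names are assumptions:

```cpp
#include <cmath>
#include <cstddef>
#include <utility>
#include <vector>

// Dense solver stand-in for Eigen's LU: Gaussian elimination with partial
// pivoting followed by back substitution.
static std::vector<double> solve(std::vector<std::vector<double>> A,
                                 std::vector<double> b)
{
    const std::size_t n = b.size();
    for (std::size_t k = 0; k < n; ++k) {
        std::size_t piv = k;  // pick the largest pivot in column k
        for (std::size_t i = k + 1; i < n; ++i)
            if (std::fabs(A[i][k]) > std::fabs(A[piv][k])) piv = i;
        std::swap(A[k], A[piv]);
        std::swap(b[k], b[piv]);
        for (std::size_t i = k + 1; i < n; ++i) {
            double m = A[i][k] / A[k][k];
            for (std::size_t j = k; j < n; ++j) A[i][j] -= m * A[k][j];
            b[i] -= m * b[k];
        }
    }
    std::vector<double> x(n);
    for (std::size_t i = n; i-- > 0; ) {  // back substitution
        double s = b[i];
        for (std::size_t j = i + 1; j < n; ++j) s -= A[i][j] * x[j];
        x[i] = s / A[i][i];
    }
    return x;
}

// Multiquadric basis h_i(x) = sqrt(R + d(x, x_i)^2).
static double basis(double x, double y, double xi, double yi, double R)
{
    double dx = x - xi, dy = y - yi;
    return std::sqrt(R + dx * dx + dy * dy);
}

struct Hardy {
    std::vector<double> xs, ys, alpha;
    double R;

    // Solve H * alpha = f, where H_ij = h_j(x_i), from F(x_i) = f_i.
    Hardy(std::vector<double> x, std::vector<double> y,
          const std::vector<double>& f, double shape)
        : xs(std::move(x)), ys(std::move(y)), R(shape)
    {
        const std::size_t n = xs.size();
        std::vector<std::vector<double>> H(n, std::vector<double>(n));
        for (std::size_t i = 0; i < n; ++i)
            for (std::size_t j = 0; j < n; ++j)
                H[i][j] = basis(xs[i], ys[i], xs[j], ys[j], R);
        alpha = solve(H, f);
    }

    // Evaluate F(x) = sum_i alpha_i h_i(x).
    double operator()(double x, double y) const {
        double s = 0.0;
        for (std::size_t i = 0; i < xs.size(); ++i)
            s += alpha[i] * basis(x, y, xs[i], ys[i], R);
        return s;
    }
};
```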

#### Calculating colormaps

Since we have interpolated values of the measured parameter at each grid point, we are able to compute a colormap. The idea is to calculate the average value of the four nodes assigned to each cell. Then the maximum and minimum of the average values are determined, and the interval formed by these two numbers is divided into several parts corresponding to the number of colors in the color table. Afterwards, each cell gets the RGB value from the color table according to its average value. The implementation has been done as follows:
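A sketch of the core of this step, the mapping from cell averages to color-table indices; the lookup in the 64-color jet table itself is omitted, and the function name is an assumption:

```cpp
#include <algorithm>
#include <cstddef>
#include <vector>

// Scales each cell average into [0, colors - 1]: the [min, max] interval of
// the averages is divided into as many parts as there are table colors.
std::vector<int> colormap_indices(const std::vector<double>& averages, int colors)
{
    auto [lo, hi] = std::minmax_element(averages.begin(), averages.end());
    double min = *lo, max = *hi;
    std::vector<int> idx(averages.size());
    for (std::size_t i = 0; i < averages.size(); ++i) {
        int k = (max > min)
              ? static_cast<int>((averages[i] - min) / (max - min) * colors)
              : 0;
        idx[i] = std::min(k, colors - 1);  // the maximum value clamps to the last color
    }
    return idx;
}
```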

#### Calculating isocontours

Similarly as before, the method for computing the isocontours takes the number of isocontours as an argument. Then the maximum and minimum of the values assigned to the grid points are calculated, and the interval formed by these two numbers is divided by the number of isocontours we want to calculate. The isocontours are computed at these thresholds using the so-called marching squares algorithm. My implementation of the marching squares algorithm is shown below:
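A simplified sketch of the per-cell step of marching squares; the names are assumptions, and the ambiguous saddle cases are resolved by a fixed pairing of crossings rather than by the full 16-case table:

```cpp
#include <vector>

struct Segment { double x1, y1, x2, y2; };

// One marching-squares step for a unit cell with corner values v[0..3],
// counter-clockwise from the bottom-left corner.  Crossing points on the
// cell edges are found by linear interpolation; each pair of crossings
// yields one segment of the isocontour at level iso.
std::vector<Segment> marching_squares_cell(const double v[4], double iso)
{
    // edge endpoints in unit-cell coordinates: bottom, right, top, left
    const double ex[4][2] = {{0, 1}, {1, 1}, {1, 0}, {0, 0}};
    const double ey[4][2] = {{0, 0}, {0, 1}, {1, 1}, {1, 0}};
    const int ca[4] = {0, 1, 2, 3}, cb[4] = {1, 2, 3, 0};  // corner pair per edge

    double px[4], py[4];
    int found = 0;
    for (int e = 0; e < 4; ++e) {
        double a = v[ca[e]], b = v[cb[e]];
        if ((a < iso) == (b < iso)) continue;   // no crossing on this edge
        double t = (iso - a) / (b - a);         // linear interpolation
        px[found] = ex[e][0] + t * (ex[e][1] - ex[e][0]);
        py[found] = ey[e][0] + t * (ey[e][1] - ey[e][0]);
        ++found;
    }
    std::vector<Segment> segs;
    for (int k = 0; k + 1 < found; k += 2)      // pair up the crossings
        segs.push_back({px[k], py[k], px[k + 1], py[k + 1]});
    return segs;
}
```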

#### Exporting data

In order to visualize the calculated data in Google Earth, it is necessary to export them in the KML file format. Regarding the colormap, we first need to export the generated image in the PNG format; for this purpose, the LodePNG C++ library has been used. Then we need to generate a KML file with the specified path to the colormap image and the coordinates where this image should appear. These coordinates are determined by the boundaries that we computed before. For the isocontours, we need to export the separate line segments, each determined by its starting and ending points, that we computed before; altogether, they create the whole isolines.

There exists a KML library written in C++ called libkml, but it is deprecated. Therefore, I decided to write my own methods for exporting the data into a KML file. One can find it in the source code.
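As a sketch of what the overlay part of such an export can look like, the following function writes a minimal KML GroundOverlay for the colormap image. The tag layout follows the KML schema, while the function name and parameters are assumptions; the real methods also write the isocontour placemarks:

```cpp
#include <sstream>
#include <string>

// Builds a minimal KML document with a GroundOverlay: the PNG colormap is
// referenced by path and pinned to the data boundaries via a LatLonBox.
std::string kml_ground_overlay(const std::string& image,
                               double north, double south,
                               double east, double west)
{
    std::ostringstream out;
    out << "<?xml version=\"1.0\" encoding=\"UTF-8\"?>\n"
        << "<kml xmlns=\"http://www.opengis.net/kml/2.2\">\n"
        << "  <GroundOverlay>\n"
        << "    <Icon><href>" << image << "</href></Icon>\n"
        << "    <LatLonBox>\n"
        << "      <north>" << north << "</north>\n"
        << "      <south>" << south << "</south>\n"
        << "      <east>"  << east  << "</east>\n"
        << "      <west>"  << west  << "</west>\n"
        << "    </LatLonBox>\n"
        << "  </GroundOverlay>\n"
        << "</kml>\n";
    return out.str();
}
```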

### References

1. Shepard, D., “A two-dimensional interpolation function for irregularly-spaced data”, Proceedings of the 1968 ACM National Conference, pp. 517–524 (1968).

2. Hardy, R., “Multiquadric equations of topography and other irregular surfaces”, Journal of Geophysical Research 76, pp. 1905–1915 (1971).

3. Micchelli, C., “Interpolation of scattered data: Distance matrices and conditionally positive definite functions”, Constructive Approximation 2, pp. 11–22 (1986).