Data processing

As you write more and more complex programs, it is the server.R file that will tend to become the largest, because it is where all the data processing and output code goes, and even where some of the functions that handle advanced UI features live. Let's look at the chunks in order and talk about the work carried out in each section.

The first chunk of code looks like the following:

library(tidyverse)
library(gapminder)
library(leaflet)
library(ggmap)

You can see the packages that we need being loaded at the top. We load the tidyverse for access to packages such as dplyr and ggplot2, for munging data and plotting, respectively. The gapminder package is used to load a subset of the gapminder data; it has been very kindly munged and placed online in the form of a CRAN package by Jenny Bryan. We will talk more about the data that it makes available later. The leaflet package, which, as we mentioned before, is used for drawing maps, is loaded next. Finally, the ggmap package is loaded; this will be used to convert country names to latitude and longitude for plotting with leaflet.
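If you haven't used ggmap before, the following is a minimal sketch of what its geocoding does (the query string is just an illustration, and it assumes the "dsk" source used in this chapter is still reachable):

library(ggmap)

# geocode() turns a place name into a data frame with lon and lat columns,
# which is exactly what leaflet needs to place a marker
geocode("Norway", source = "dsk")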

The next thing that we do is munge the data or, if it has already been munged, load it from disk, as shown in the following code:

if(!file.exists("geocodedData.Rdata")){

  # take one row per country and geocode its name
  mapData = gapminder %>%
    mutate(country2 = as.character(country)) %>%
    group_by(country) %>%
    slice(1) %>%
    mutate_geocode(country2, source = "dsk") %>%
    select(-country2)

  # join the coordinates back onto the full dataset and fill in the gaps
  mapData = left_join(gapminder, mapData) %>%
    group_by(country) %>%
    fill(lon) %>%
    fill(lat)

  # save the result so we don't have to geocode again next time
  save(mapData, file = "geocodedData.Rdata")

} else {

  load("geocodedData.Rdata")
}

I'm not going to distract from Shiny by talking too much about this code. Essentially, it checks whether the data is already in the relevant directory (which it will be after you've run it once). If it is, it just loads the data. If not, it prepares it by producing a smaller dataset with one instance of each country and geocoding it (that is, converting it to latitude and longitude), then combining it with the whole dataset (to put all the years for each country back in), and filling in the resulting missing geocodings using the incredibly useful fill() function from tidyr. Having done all that, it saves the data for next time (since querying the API for all 142 countries takes quite a long time). Don't worry too much if you can't follow this code at the moment; it's really there just to get the data onto your computer in a simple way.
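If fill() is new to you, the following toy example (entirely separate from the gapminder data; the demo object is just an illustration) shows what it does: within each group, it carries the last non-missing value downwards, which is how the single geocoded row per country gets spread across all of that country's years.

library(tidyverse)

# a made-up data frame with one geocoded row per country and the rest missing
demo = tibble(
  country = c("Albania", "Albania", "Albania"),
  year    = c(1952, 1957, 1962),
  lon     = c(20.2, NA, NA)
)

# fill() replaces each NA with the last non-missing value above it,
# working within each group
demo %>%
  group_by(country) %>%
  fill(lon)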

It's worth noting that data instructions, and indeed any other code, appearing above function(input, output) {} are executed once when the application starts, and then serve every instance of the Shiny application. This makes no difference when you're developing on a local machine, since the code starts fresh every time you run it, but it can make a big difference once you move your code to a server, because the code is run only once for all of your users and application instances (until you restart the Shiny server that manages connections to your applications, of course). Any intensive data loading or processing is therefore best kept in this section, so that your users don't have to wait a long time for their application to load.
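To make that distinction concrete, the following is a skeleton of a server.R file rather than part of the gapminder application; the object and file names are placeholders:

library(tidyverse)

# everything above the server function runs once, when the application
# starts, and is shared by all of the sessions it serves
bigData = read_csv("expensive_to_load.csv")

function(input, output) {

  # everything inside the server function runs separately for each user session
  output$dataSummary = renderPrint({
    summary(bigData) # the shared object is visible here
  })
}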
