Increasingly, successful websites are not completely standalone. To engage existing users and find new users, integration with social networking is a must. To provide store locators or other location-aware services, using geolocation and mapping services is essential. It doesn’t stop there: more and more organizations are realizing that providing an API helps expand their service and makes it more useful.
In this chapter, we’ll be discussing the two most common integration needs: social media and geolocation.
Social media is a great way to promote your product or service: if that’s your goal, the ability for your users to easily share your content on social media sites is essential. As I write this, the dominant social networking services are Facebook, Twitter, Instagram, and YouTube. Sites like Pinterest and Flickr have their place, but they are usually a little more audience specific (for example, if your website is about DIY crafting, you would absolutely want to support Pinterest). Laugh if you will, but I predict that MySpace will make a comeback. Its site redesign is inspired, and it’s worth noting that MySpace is built on Node.
Most social media integration is a frontend affair. You reference the appropriate JavaScript files in your page, and it enables both incoming content (the top three stories from your Facebook page, for example) and outgoing content (the ability to tweet about the page you’re on, for example). While this often represents the easiest path to social media integration, it comes at a cost: I’ve seen page load times double or even triple thanks to the additional HTTP requests. If page performance is important to you (and it should be, especially for mobile users), you should carefully consider how you integrate social media.
That said, the code that enables a Facebook Like button or a Tweet button leverages in-browser cookies to post on the user’s behalf. Moving this functionality to the backend would be difficult (and, in some instances, impossible). So if that is functionality you need, linking in the appropriate third-party library is your best option, even though it can affect your page performance.
Let’s say that we want to mention the top 10 most recent tweets that contain the hashtags #Oregon #travel. We could use a frontend component to do this, but it will involve additional HTTP requests. Furthermore, if we do it on the backend, we have the option of caching the tweets for performance. Also, if we do the searching on the backend, we can “blacklist” uncharitable tweets, which would be more difficult on the frontend.
Twitter, like Facebook, allows you to create apps. It’s something of a misnomer: a Twitter app doesn’t do anything (in the traditional sense). It’s more like a set of credentials that you can use to create the actual app on your site. The easiest and most portable way to access the Twitter API is to create an app and use it to get access tokens.
Create a Twitter app by going to http://dev.twitter.com. Make sure you’re logged on, and click your username in the navigation bar, and then Apps. Click “Create an app,” and follow the instructions. Once you have an application, you’ll see that you now have a consumer API key and an API secret key. The API secret key, as the name indicates, should be kept secret: do not ever include this in responses sent to the client. If a third party were to get access to this secret, they could make requests on behalf of your application, which could have unfortunate consequences for you if the use is malicious.
Now that we have a consumer API key and secret key, we can communicate with the Twitter REST API.
To keep our code tidy, we’ll put our Twitter code in a module called lib/twitter.js:
const https = require('https')

module.exports = twitterOptions => {

  return {
    search: async (query, count) => {
      // TODO
    },
  }

}
This pattern should be starting to look familiar: our module exports a function into which the caller passes a configuration object, and what’s returned is an object containing methods. In this way, we can add functionality to the module. Currently, we’re providing only a search method. Here’s how we will be using the library:
const twitter = require('./lib/twitter')({
  consumerApiKey: credentials.twitter.consumerApiKey,
  apiSecretKey: credentials.twitter.apiSecretKey,
})

const tweets = await twitter.search('#Oregon #travel', 10)
// the tweets themselves will be in tweets.statuses
(Don’t forget to put a twitter property with consumerApiKey and apiSecretKey in your .credentials.development.json file.)
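For reference, the twitter entry in that file might look something like this (the placeholder values are, of course, stand-ins for your own keys):

```json
{
  "twitter": {
    "consumerApiKey": "<YOUR CONSUMER API KEY>",
    "apiSecretKey": "<YOUR API SECRET KEY>"
  }
}
```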
Before we implement the search method, we must provide some functionality to authenticate ourselves to Twitter. The process is simple: we use HTTPS to request an access token based on our consumer key and consumer secret. We only have to do this once: currently, Twitter does not expire access tokens (though you can invalidate them manually). Since we don’t want to request an access token every time, we’ll cache the access token so we can reuse it.

The way we’ve constructed our module allows us to create private functionality that’s not available to the caller. Specifically, the only thing that’s available to the caller is module.exports. Since we’re returning a function, only that function is available to the caller. Calling that function results in an object, and only the properties of that object are available to the caller. So we’re going to create a variable accessToken, which we’ll use to cache our access token, and a getAccessToken function that will get the access token. The first time it’s called, it will make a Twitter API request to get the access token. Subsequent calls will simply return the value of accessToken:
const https = require('https')

module.exports = function(twitterOptions) {

  // this variable will be invisible outside of this module
  let accessToken = null

  // this function will be invisible outside of this module
  const getAccessToken = async () => {
    if(accessToken) return accessToken
    // TODO: get access token
  }

  return {
    search: async (query, count) => {
      // TODO
    },
  }

}
We mark getAccessToken as async because we may have to make an HTTP request to the Twitter API (if there isn’t a cached token). Now that we’ve established the basic structure, let’s implement getAccessToken:
const getAccessToken = async () => {
  if(accessToken) return accessToken
  const bearerToken = Buffer.from(
    encodeURIComponent(twitterOptions.consumerApiKey) + ':' +
    encodeURIComponent(twitterOptions.apiSecretKey)
  ).toString('base64')
  const options = {
    hostname: 'api.twitter.com',
    port: 443,
    method: 'POST',
    path: '/oauth2/token?grant_type=client_credentials',
    headers: {
      'Authorization': 'Basic ' + bearerToken,
    },
  }
  return new Promise((resolve, reject) =>
    https.request(options, res => {
      let data = ''
      res.on('data', chunk => data += chunk)
      res.on('end', () => {
        const auth = JSON.parse(data)
        if(auth.token_type !== 'bearer')
          return reject(new Error('Twitter auth failed.'))
        accessToken = auth.access_token
        return resolve(accessToken)
      })
    }).end()
  )
}
The details of constructing this call are available on Twitter’s developer documentation page for application-only authentication. Basically, we have to construct a bearer token that’s a base64-encoded combination of the consumer key and consumer secret. Once we’ve constructed that token, we can call the /oauth2/token API with the Authorization header containing the bearer token to request an access token. Note that we must use HTTPS: if you attempt to make this call over HTTP, you are transmitting your secret key unencrypted, and the API will simply hang up on you.

Once we receive the full response from the API (we listen for the end event of the response stream), we can parse the JSON, make sure the token type is bearer, and be on our merry way. We cache the access token, and then resolve the promise with it.

Now that we have a mechanism for obtaining an access token, we can make API calls. So let’s implement our search method:
search: async (query, count) => {
  const accessToken = await getAccessToken()
  const options = {
    hostname: 'api.twitter.com',
    port: 443,
    method: 'GET',
    path: '/1.1/search/tweets.json?q=' +
      encodeURIComponent(query) +
      '&count=' + (count || 10),
    headers: {
      'Authorization': 'Bearer ' + accessToken,
    },
  }
  return new Promise((resolve, reject) =>
    https.request(options, res => {
      let data = ''
      res.on('data', chunk => data += chunk)
      res.on('end', () => resolve(JSON.parse(data)))
    }).end()
  )
},
Now we have the ability to search tweets…so how do we display them on our site? Largely, it’s up to you, but there are some things to consider. Twitter has an interest in making sure its data is used in a manner consistent with its brand. To that end, it has display requirements, which specify functional elements you must include when displaying a tweet.
There is some wiggle room in the requirements (for example, if you’re displaying on a device that doesn’t support images, you don’t have to include the avatar image), but for the most part, you’ll end up with something that looks very much like an embedded tweet. It’s a lot of work, and there is a way around it…but it involves linking to Twitter’s widget library, which is the very HTTP request we’re trying to avoid.
If you need to display tweets, your best bet is to use the Twitter widget library, even though it incurs an extra HTTP request. For more complicated use of the API, you’ll still have to access the REST API from the backend, so you will probably end up using the REST API in concert with frontend scripts.
Let’s continue with our example: we want to display the top 10 tweets that mention the hashtags #Oregon #travel. We’ll use the REST API to search for the tweets and the Twitter widget library to display them. Since we don’t want to run up against usage limits (or slow down our server), we’ll cache the tweets and the HTML to display them for 15 minutes.
We’ll start by modifying our Twitter library to include a method embed, which gets the HTML to display a tweet. Note that we’re using the npm library querystringify to construct a querystring from an object, so don’t forget to npm install querystringify and import it (const qs = require('querystringify')), and then add the following function to the export of lib/twitter.js:
embed: async (url, options = {}) => {
  options.url = url
  const accessToken = await getAccessToken()
  const requestOptions = {
    hostname: 'api.twitter.com',
    port: 443,
    method: 'GET',
    path: '/1.1/statuses/oembed.json?' + qs.stringify(options),
    headers: {
      'Authorization': 'Bearer ' + accessToken,
    },
  }
  return new Promise((resolve, reject) =>
    https.request(requestOptions, res => {
      let data = ''
      res.on('data', chunk => data += chunk)
      res.on('end', () => resolve(JSON.parse(data)))
    }).end()
  )
},
Now we’re ready to search for, and cache, tweets. In our main application file, create the following function getTopTweets:
const createTwitterClient = require('./lib/twitter')
const twitterClient = createTwitterClient(credentials.twitter)

const getTopTweets = ((twitterClient, search) => {
  const topTweets = {
    count: 10,
    lastRefreshed: 0,
    refreshInterval: 15 * 60 * 1000,
    tweets: [],
  }
  return async () => {
    if(Date.now() > topTweets.lastRefreshed + topTweets.refreshInterval) {
      const tweets = await twitterClient.search(search, topTweets.count)
      const formattedTweets = await Promise.all(
        tweets.statuses.map(async ({ id_str, user }) => {
          const url = `https://twitter.com/${user.id_str}/statuses/${id_str}`
          const embeddedTweet = await twitterClient.embed(url, { omit_script: 1 })
          return embeddedTweet.html
        })
      )
      topTweets.lastRefreshed = Date.now()
      topTweets.tweets = formattedTweets
    }
    return topTweets.tweets
  }
})(twitterClient, '#Oregon #travel')
The essence of the getTopTweets function is not just to search for tweets with a specified hashtag, but to cache those tweets for some reasonable period of time. Note that we created an immediately invoked function expression (IIFE): that’s because we want the topTweets cache safely inside a closure so it can’t be messed with. The asynchronous function that’s returned from the IIFE refreshes the cache if necessary and then returns the contents of the cache.

Lastly, let’s create a view, views/social.handlebars, as a home for our social media presence (which, right now, includes only our selected tweets):
<h2>Oregon Travel in Social Media</h2>

<script id="twitter-wjs" type="text/javascript" async defer
  src="//platform.twitter.com/widgets.js"></script>

{{{tweets}}}
And a route to handle it:
app.get('/social', async (req, res) => {
  res.render('social', { tweets: await getTopTweets() })
})
Note that we reference an external script, Twitter’s widgets.js. This is the script that will format and give functionality to the embedded tweets on your page. By default, the oembed API will include a reference to this script in the HTML, but since we’re displaying 10 tweets, that would reference the script nine more times than necessary! So recall that, when we called the oembed API, we passed in the option { omit_script: 1 }. Since we did that, we have to provide the script somewhere, which we did in the view. Go ahead and try removing the script from the view. You’ll still see the tweets, but they won’t have any formatting or functionality.
Now we have a nice social media feed! Let’s turn our attention to another important application: displaying maps in our application.
Geocoding refers to the process of taking a street address or place name (Bletchley Park, Sherwood Drive, Bletchley, Milton Keynes MK3 6EB, UK) and converting it to geographic coordinates (latitude 51.9976597, longitude –0.7406863). If your application is going to be doing any kind of geographic calculation—distances or directions—or displaying a map, then you’ll need geographic coordinates.
You may be used to seeing geographic coordinates specified in degrees, minutes, and seconds (DMS). Geocoding APIs and mapping services use a single floating-point number for latitude and longitude. If you need to display DMS coordinates, see this Wikipedia article.
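If you do need DMS for display, the conversion is simple enough to do yourself. Here’s one possible sketch (the function name and formatting choices are mine, not from any library):

```javascript
// Convert a decimal-degree coordinate to a degrees/minutes/seconds string.
// isLatitude selects N/S vs. E/W for the hemisphere suffix.
function toDMS(decimal, isLatitude) {
  const hemisphere = isLatitude
    ? (decimal < 0 ? 'S' : 'N')
    : (decimal < 0 ? 'W' : 'E')
  const abs = Math.abs(decimal)
  const degrees = Math.floor(abs)
  const minutesFloat = (abs - degrees) * 60
  const minutes = Math.floor(minutesFloat)
  const seconds = ((minutesFloat - minutes) * 60).toFixed(1)
  return `${degrees}°${minutes}′${seconds}″${hemisphere}`
}

// the Bletchley Park coordinates from the example above
console.log(toDMS(51.9976597, true))   // 51°59′51.6″N
console.log(toDMS(-0.7406863, false))  // 0°44′26.5″W
```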
Both Google and Bing offer excellent REST services for geocoding. We’ll be using Google for our example, but the Bing service is very similar.
Without attaching a billing account to your Google account, your geocoding requests will be limited to one per day, which makes for a very slow testing cycle indeed! Whenever possible in this book, I’ve tried to avoid recommending services you couldn’t at least use in a development capacity for free. I did try some free geocoding services and found a significant enough gulf in usability that I continue to recommend Google geocoding. However, as I write this, development-volume geocoding with Google is effectively free: you receive a $200 monthly credit with your account, and you would have to make 40,000 requests to exhaust it! If you want to follow along with this chapter, go to your Google console, choose Billing from the main menu, and enter your billing information.
Once you’ve set up billing, you’ll need an API key for Google’s geocoding API. Go to the console, select your project from the navigation bar, and then click on APIs. If the geocoding API isn’t in your list of enabled APIs, locate it in the list of additional APIs and add it. Most of the Google APIs share the same API credentials, so click on the navigation menu in the upper left, and go back to your dashboard. Click on Credentials, and create a new API key if you don’t have an appropriate one already. Note that API keys can be restricted to prevent abuse, so make sure your API key can be used from your application. If you need one for developing, you can restrict the key by IP address, and choose your IP address (if you don’t know what it is, you can just ask Google, “What’s my IP address?”).
Once you have an API key, add it to .credentials.development.json:
"google": {
  "apiKey": "<YOUR API KEY>"
}
Then create a module lib/geocode.js:
const https = require('https')
const { credentials } = require('../config')

module.exports = async query => {
  const options = {
    hostname: 'maps.googleapis.com',
    path: '/maps/api/geocode/json?address=' +
      encodeURIComponent(query) + '&key=' +
      credentials.google.apiKey,
  }
  return new Promise((resolve, reject) =>
    https.request(options, res => {
      let data = ''
      res.on('data', chunk => data += chunk)
      res.on('end', () => {
        data = JSON.parse(data)
        if(!data.results.length)
          return reject(new Error(`no results for "${query}"`))
        resolve(data.results[0].geometry.location)
      })
    }).end()
  )
}
Now we have a function that will contact the Google API to geocode an address. If it can’t find an address (or fails for any other reason), the promise will reject with an error. The API can return multiple addresses. For example, if you search for “10 Main Street” without specifying a city, state, or postal code, it will return dozens of results. Our implementation simply picks the first one. The API returns a lot of information, but all we’re currently interested in are the coordinates. You could easily modify this interface to return more information. See the Google geocoding API documentation for more information about the data the API returns.
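For example, a sketch of an extended interface that also surfaces the formatted address and location type from the first result might look like this (the field names follow the Google geocoding API’s JSON response format; the helper name is my own):

```javascript
// Extract a richer result from a Google geocoding API response body;
// `data` is the already-parsed JSON from /maps/api/geocode/json.
const extractGeocodeResult = data => {
  if(!data.results.length) throw new Error('no results')
  const [first] = data.results
  return {
    coordinates: first.geometry.location,        // { lat, lng }
    formattedAddress: first.formatted_address,   // canonicalized address
    locationType: first.geometry.location_type,  // e.g. 'ROOFTOP'
  }
}
```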
The Google geocoding API currently doesn’t have a monthly usage limit, but you pay $0.005 per geocoding request. So if you made a million requests in a given month, you’d have a $5,000 bill from Google…so there probably is a practical limit for you!
If you’re worried about runaway charges—which could happen if you accidentally leave a service running, or if a bad actor gets access to your credentials—you can add a budget and configure alerts to notify you as you approach them. Go to your Google developer console, and choose “Budgets & alerts” from the Billing menu.
At the time of writing, Google limits you to 5,000 requests per 100 seconds to prevent abuse, which would be difficult to exceed. Google’s API also requires that if you use a map on your website, you use Google Maps. That is, if you’re using Google’s service to geocode your data, you can’t turn around and display that information on a Bing map without violating the terms of service. Generally, this is not an onerous restriction, as you probably wouldn’t be doing geocoding unless you intended to display locations on a map. However, if you like Bing’s maps better than Google’s, or vice versa, you should be mindful of the terms of service and use the appropriate API.
We have a nice database of vacation packages around Oregon, and we might decide we want to display a map with pins showing where the various vacations are. This is where geocoding comes in.
We already have vacation data in the database, and each vacation has a location search string that will work with geocoding, but we don’t yet have coordinates.
The question now is when and how do we do the geocoding? Broadly speaking, we have three options:
Geocode when we add new vacations to the database. This is probably a great option when we add an admin interface to the system that allows vendors to dynamically add vacations to the database. Since we’re stopping short of this functionality, however, we’ll discard this option.
Geocode as necessary when retrieving vacations from the database. This approach would do a check every time we get vacations from the database: if any of them have missing coordinates, we would geocode them. This option sounds appealing, and is probably the easiest of the three, but it has some big disadvantages that make it unsuitable. The first is performance: if you add a thousand new vacations to the database, the first person to look at the vacations list is going to have to wait for all of those geocoding requests to succeed and get written to the database. Furthermore, one can imagine a situation where a load testing suite adds a thousand vacations to the database and then performs a thousand requests. Since they all run concurrently, each of those thousand requests results in a thousand geocoding requests because the data hasn’t been written to the database yet…resulting in a million geocoding requests and a $5,000 bill from Google! So let’s cross this one off the list.
Have a script to find vacations with missing coordinate data, and geocode those. This approach offers the best solution for our current situation. For development purposes, we’re populating the vacation database once, and we don’t yet have an admin interface for adding new vacations. Furthermore, if we do decide to add an admin interface later, this approach isn’t incompatible with it: as a matter of fact, we could just run this process after adding a new vacation, and it would work.
First, we need to add a way to update an existing vacation in db.js (we’ll also add a method to close the database connection, which will come in handy in scripts):
module.exports = {
  //...
  updateVacationBySku: async (sku, data) => Vacation.updateOne({ sku }, data),
  close: () => mongoose.connection.close(),
}
Then we can write a script db-geocode.js:
const db = require('./db')
const geocode = require('./lib/geocode')

const geocodeVacations = async () => {
  const vacations = await db.getVacations()
  const vacationsWithoutCoordinates = vacations.filter(({ location }) =>
    !location.coordinates || typeof location.coordinates.lat !== 'number')
  console.log(`geocoding ${vacationsWithoutCoordinates.length} ` +
    `of ${vacations.length} vacations:`)
  return Promise.all(vacationsWithoutCoordinates.map(async ({ sku, location }) => {
    const { search } = location
    if(typeof search !== 'string' || !/\w/.test(search))
      return console.log(`  SKU ${sku} FAILED: does not have location.search`)
    try {
      const coordinates = await geocode(search)
      await db.updateVacationBySku(sku, { location: { search, coordinates } })
      console.log(`  SKU ${sku} SUCCEEDED: ${coordinates.lat}, ${coordinates.lng}`)
    } catch(err) {
      return console.log(`  SKU ${sku} FAILED: ${err.message}`)
    }
  }))
}

geocodeVacations()
  .then(() => {
    console.log('DONE')
    db.close()
  })
  .catch(err => {
    console.error('ERROR: ' + err.message)
    db.close()
  })
When you run the script (node db-geocode.js
), you should see that all of your vacations have been successfully geocoded! Now that we have that information, let’s learn how to display it on a map….
While displaying vacations on a map really falls under “frontend” work, it would be very disappointing to get this far and not see the fruits of our labor. So we’re going to take a slight departure from the backend focus of this book and see how to display our newly geocoded vacations on a map.
We already created a Google API key to do our geocoding, but we still need to enable the maps API. Go to your Google console, click on APIs, and find Maps JavaScript API and enable it if it isn’t already.
Now we can create a view to display our vacations map, views/vacations-map.handlebars. We’ll start with just displaying the map, and work on adding vacations next:
<div id="map" style="width: 100%; height: 60vh;"></div>
<script>
  let map = undefined
  async function initMap() {
    map = new google.maps.Map(document.getElementById('map'), {
      // approximate geographic center of oregon
      center: { lat: 44.0978126, lng: -120.0963654 },
      // this zoom level covers most of the state
      zoom: 7,
    })
  }
</script>
<script
  src="https://maps.googleapis.com/maps/api/js?key={{googleApiKey}}&callback=initMap"
  async defer></script>
Now it’s time to put some pins on the map corresponding to our vacations. In Chapter 15, we created an API endpoint /api/vacations, which will now include geocoded data. We’ll use that endpoint to get our vacations and put pins on the map. Modify the initMap function in views/vacations-map.handlebars:
async function initMap() {
  map = new google.maps.Map(document.getElementById('map'), {
    // approximate geographic center of oregon
    center: { lat: 44.0978126, lng: -120.0963654 },
    // this zoom level covers most of the state
    zoom: 7,
  })
  const vacations = await fetch('/api/vacations').then(res => res.json())
  vacations.forEach(({ name, location }) => {
    const marker = new google.maps.Marker({
      position: location.coordinates,
      map,
      title: name,
    })
  })
}
Now we have a map showing where all our vacations are! There are a lot of ways we could improve this page: probably the best place to start would be linking the markers to the vacation detail page, so you could click on a marker and it would take you to the vacation info page. We could also implement custom markers or tooltips: the Google Maps API has a lot of features, and you can learn about them from the official Google documentation.
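As a sketch of that first improvement, here’s one way to wire up marker clicks. This is my own illustration, not code from the chapter: the Marker constructor is injected so the logic can be exercised without the Maps library, and the /vacation/${sku} URL scheme is an assumption that should match however your detail pages are actually routed (it also assumes the /api/vacations endpoint includes each vacation’s sku):

```javascript
// Create one marker per vacation and navigate to its (assumed) detail
// page when the marker is clicked. MarkerCtor is google.maps.Marker in
// the browser, or a stand-in for testing.
function addVacationMarkers(map, vacations, MarkerCtor) {
  return vacations.map(({ name, sku, location }) => {
    const marker = new MarkerCtor({
      position: location.coordinates,
      map,
      title: name,
    })
    // clicking a marker navigates to that vacation's detail page
    marker.addListener('click', () => {
      window.location.href = `/vacation/${encodeURIComponent(sku)}`
    })
    return marker
  })
}

// in initMap, you would call:
// addVacationMarkers(map, vacations, google.maps.Marker)
```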
Remember our “current weather” widget from Chapter 7? Let’s get that hooked up with some live data! We’ll be using the US National Weather Service (NWS) API to get forecast information. As with our Twitter integration, and our use of geocoding, we’ll be caching the forecast to prevent passing every hit to our website on to NWS (which might get us blacklisted if our website gets popular). Create a file called lib/weather.js:
const https = require('https')
const { URL } = require('url')

const _fetch = url => new Promise((resolve, reject) => {
  const { hostname, pathname, search } = new URL(url)
  const options = {
    hostname,
    path: pathname + search,
    headers: { 'User-Agent': 'Meadowlark Travel' },
  }
  https.get(options, res => {
    let data = ''
    res.on('data', chunk => data += chunk)
    res.on('end', () => resolve(JSON.parse(data)))
  })
})

module.exports = locations => {

  const cache = {
    refreshFrequency: 15 * 60 * 1000,
    lastRefreshed: 0,
    refreshing: false,
    forecasts: locations.map(location => ({ location })),
  }

  const updateForecast = async forecast => {
    if(!forecast.url) {
      const { lat, lng } = forecast.location.coordinates
      const path = `/points/${lat.toFixed(4)},${lng.toFixed(4)}`
      const points = await _fetch('https://api.weather.gov' + path)
      forecast.url = points.properties.forecast
    }
    const { properties: { periods } } = await _fetch(forecast.url)
    const currentPeriod = periods[0]
    Object.assign(forecast, {
      iconUrl: currentPeriod.icon,
      weather: currentPeriod.shortForecast,
      temp: currentPeriod.temperature + ' ' + currentPeriod.temperatureUnit,
    })
    return forecast
  }

  const getForecasts = async () => {
    if(Date.now() > cache.lastRefreshed + cache.refreshFrequency) {
      console.log('updating cache')
      cache.refreshing = true
      cache.forecasts = await Promise.all(cache.forecasts.map(updateForecast))
      cache.refreshing = false
    }
    return cache.forecasts
  }

  return getForecasts

}
You’ll notice that we got tired of using Node’s built-in https library directly, and instead created a utility function _fetch to make our weather functionality a little more readable. One thing that might jump out at you is that we’re setting the User-Agent header to Meadowlark Travel. This is a quirk of the NWS weather API: it requires a string for the User-Agent. They state that they will eventually replace this with an API key, but for now we just need to provide a value here.
Getting weather data from the NWS API is a two-part affair. There’s an API endpoint called points that takes a latitude and longitude (with exactly four decimal digits) and returns information about that location…including the appropriate URL from which to get a forecast. Once we have that URL for any given set of coordinates, we don’t need to fetch it again. All we need to do is call that URL to get the updated forecast.
Note that a lot more data is returned from the forecast than we’re using; we could get a lot more sophisticated with this feature. In particular, the forecast URL returns an array of periods, with the first element being the current period (for example, “afternoon” or “evening”). It follows up with periods stretching into the next week. Feel free to look at the data in the periods array to see the kind of data that’s available to you.
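As a sketch of what you could do with that richer data, here’s a helper (my own, not part of the chapter’s code) that condenses the first few periods into a simple outlook, using the period fields described above (name, temperature, temperatureUnit, shortForecast):

```javascript
// Condense the NWS forecast periods array into a simple outlook;
// `periods` is the array from the forecast URL's properties.periods.
const summarizePeriods = (periods, count = 4) =>
  periods.slice(0, count).map(period => ({
    name: period.name,                                        // e.g. "Tonight"
    temp: `${period.temperature} ${period.temperatureUnit}`,  // e.g. "55 F"
    forecast: period.shortForecast,                           // e.g. "Partly Cloudy"
  }))
```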
One detail worth pointing out is that we have a boolean property in our cache called refreshing. This is necessary since updating the cache takes a finite amount of time, and is done asynchronously. If multiple requests come in before the first cache refresh completes, they will all kick off the work of refreshing the cache. It won’t hurt anything, exactly, but you will be making more API calls than are strictly necessary. This boolean variable is just a flag to any additional requests to say, “We’re working on it.”
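If you did want to eliminate those redundant API calls entirely, one approach (a sketch of mine, not the chapter’s implementation) is to cache the in-flight promise itself, so concurrent callers share a single refresh. Here fetchAll stands in for whatever does the real refresh work:

```javascript
// A cached getter that shares one in-flight refresh among concurrent callers.
const makeCachedGetter = (fetchAll, refreshFrequency) => {
  const cache = { lastRefreshed: 0, refreshPromise: null, data: null }
  return async () => {
    if(Date.now() > cache.lastRefreshed + refreshFrequency) {
      if(!cache.refreshPromise) {
        // the first caller starts the refresh and publishes the promise...
        cache.refreshPromise = fetchAll().then(data => {
          cache.data = data
          cache.lastRefreshed = Date.now()
          cache.refreshPromise = null
        })
      }
      // ...and every caller (including the first) waits on that same promise
      await cache.refreshPromise
    }
    return cache.data
  }
}
```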
We’ve designed this to be a drop-in replacement for the dummy function we created back in Chapter 7. All we have to do is open lib/middleware/weather.js and replace the getWeatherData function:
const weatherData = require('../weather')

const getWeatherData = weatherData([
  {
    name: 'Portland',
    coordinates: { lat: 45.5154586, lng: -122.6793461 },
  },
  {
    name: 'Bend',
    coordinates: { lat: 44.0581728, lng: -121.3153096 },
  },
  {
    name: 'Manzanita',
    coordinates: { lat: 45.7184398, lng: -123.9351354 },
  },
])
We’ve really only scratched the surface of what can be done with third-party API integration. Everywhere you look, new APIs are popping up, offering every kind of data imaginable (even the City of Portland is now making a lot of public data available through REST APIs). While it would be impossible to cover even a small percentage of the APIs available to you, this chapter has covered the fundamentals you’ll need to know to use these APIs: http.request, https.request, and parsing JSON.
We now have a lot of knowledge under our belt. We’ve covered a lot of ground! What happens when things go wrong, though? In the next chapter, we’ll be discussing debugging techniques to help us when things don’t work out as we expect.