Using the custom service discovery

To see for ourselves how a custom service discovery behaves, we'll rely on our test environment. The custom-sd binary, which recreates the Consul discovery integration as an example of a custom service discovery, is deployed alongside Prometheus and is ready to be used. Together with the Consul deployment, we have all the required components in the test environment to see how everything fits together.

custom-sd can be built on a machine with a Go development environment set up by issuing the following command:

go get github.com/prometheus/prometheus/documentation/examples/custom-sd/adapter-usage

First, we need to connect to the prometheus instance, which we can do with the following command:

vagrant ssh prometheus

We can then proceed to change the Prometheus configuration to use file_sd as our integration. For this, we must replace the scrape job configured to use consul_sd with a new one. To make things easier, we placed a configuration file with this change already made in /etc/prometheus/. To use it, you just need to replace the current configuration with the new one:

vagrant@prometheus:~$ sudo mv /etc/prometheus/prometheus_file_sd.yml /etc/prometheus/prometheus.yml

The scrape job we are interested in is as follows:

- job_name: 'file_sd'
  file_sd_configs:
    - files:
        - custom_file_sd.json
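
Note that file_sd watches the listed files for changes, so Prometheus picks up updates almost immediately; as a fallback, the files are also re-read periodically. The interval is controlled by the optional refresh_interval setting, shown here with its default value:

- job_name: 'file_sd'
  file_sd_configs:
    - files:
        - custom_file_sd.json
      refresh_interval: 5m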

To make Prometheus aware of these changes, we must reload it:

vagrant@prometheus:~$ sudo systemctl reload prometheus
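
If you'd like to validate the new configuration before (or after) reloading, promtool, which ships with Prometheus, can check it for errors (this assumes promtool is available in the PATH of the test instance):

vagrant@prometheus:~$ promtool check config /etc/prometheus/prometheus.yml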

We should also make sure that the Consul server has the configuration for consul-exporter, which we added previously. If, by any chance, you missed that step, you can add it now by running the following command:

vagrant@prometheus:~$ curl --request PUT \
    --data @/vagrant/chapter12/configs/consul_exporter/payload.json \
    http://consul:8500/v1/agent/service/register
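
For reference, a registration payload for this Consul endpoint looks roughly like the following. This is a sketch reconstructed from the discovered target shown later in this section, not the verbatim contents of payload.json:

{
  "ID": "consul-exporter01",
  "Name": "consul-exporter",
  "Tags": ["consul", "exporter", "prometheus"],
  "Address": "consul",
  "Port": 9107
}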

If we take a look at the Prometheus web interface, we will see something similar to the following:

Figure 12.14: Prometheus /service-discovery endpoint without any file_sd targets

We're now ready to try out the custom-sd application. We'll need to specify the Consul API address and the path to the output file, which the Prometheus server is configured to read from. The following command will take care of that, and also ensure that the right user is creating the file, so that the Prometheus process is able to access it:

vagrant@prometheus:~$ sudo -u prometheus -- custom-sd --output.file="/etc/prometheus/custom_file_sd.json" --listen.address="consul:8500"

We now have the custom service discovery running. If we go back to the web interface of Prometheus in the /service-discovery endpoint, we'll be able to see the discovered target:

Figure 12.15: Prometheus /service-discovery endpoint depicting the discovered target

We can also inspect the file created by custom-sd and validate its contents, as follows (the output has been compacted for brevity):

vagrant@prometheus:~$ sudo cat /etc/prometheus/custom_file_sd.json 
[
    {
        "targets": ["consul:9107"],
        "labels": {
            "__address__": "consul:9107",
            "__meta_consul_address": "192.168.42.11",
            "__meta_consul_network_segment": "",
            "__meta_consul_node": "consul",
            "__meta_consul_service_address": "consul",
            "__meta_consul_service_id": "consul-exporter01",
            "__meta_consul_service_port": "9107",
            "__meta_consul_tags": ",consul,exporter,prometheus,"
        }
    }
]
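
Since the adapter passes the __meta_consul_* labels through unchanged, they remain available for relabeling, just as with the native Consul integration. As a sketch (the target label names below are our own choice, not part of the test environment's configuration), the scrape job could map some of them to regular labels:

- job_name: 'file_sd'
  file_sd_configs:
    - files:
        - custom_file_sd.json
  relabel_configs:
    - source_labels: [__meta_consul_node]
      target_label: consul_node
    - source_labels: [__meta_consul_service_id]
      target_label: service_id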

And that's it! You now have a custom service discovery up and running, fully integrated with Prometheus using the file-based service discovery mechanism. A more serious deployment would have the custom-sd service running as a daemon. If you're more comfortable with a scripting language, you could choose to write a service discovery script that produces the discovery file and exits, in which case running it as a cron job would be an option. As a last suggestion, you could have your configuration management software produce the discovery file dynamically on a schedule.
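
To illustrate the scripting approach, the following is a minimal sketch in Python. The Consul address, service name, and output path are assumptions matching our test environment, and a production version would want error handling and atomic file replacement:

#!/usr/bin/env python3
# Cron-style discovery script: query Consul's catalog API and write the
# result in the file_sd format, then exit.
# Assumptions: Consul is reachable at consul:8500 and the service is
# registered under the name "consul-exporter".
import json
import urllib.request

CONSUL_URL = "http://consul:8500/v1/catalog/service/consul-exporter"
OUTPUT_FILE = "/etc/prometheus/custom_file_sd.json"

with urllib.request.urlopen(CONSUL_URL) as resp:
    entries = json.load(resp)

# One target group per catalog entry; ServiceAddress may be empty, in
# which case Consul expects clients to fall back to the node address.
groups = [{
    "targets": ["{}:{}".format(e["ServiceAddress"] or e["Address"],
                               e["ServicePort"])],
    "labels": {"job": "consul-exporter"},
} for e in entries]

with open(OUTPUT_FILE, "w") as f:
    json.dump(groups, f, indent=2)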
