Configuring Swift services and users in Keystone

Configuring our OpenStack Object Storage environment in Keystone follows the familiar pattern: define the service and its endpoint, then create an appropriate user in the service tenant. These details will then be used to configure the storage services later in the chapter.

In this environment, we define the address and port of the proxy server. In the test environment, the proxy server's IP addresses are 192.168.100.209 (public) and 172.16.0.209 (internal/management). In production, this would be a load-balanced pool address.

Getting ready

Ensure that you are logged in to the controller node, or an appropriate client that has access to the controller node, to configure Keystone. If this was created using the Vagrant environment, you can issue the following command:

vagrant ssh controller

How to do it...

Configure Keystone for use by Swift by carrying out the following steps:

  1. To do this, we use the Keystone client and configure it for use by an administrator by setting the following environment variables:
    export ENDPOINT=192.168.100.200
    export SERVICE_TOKEN=ADMIN
    export SERVICE_ENDPOINT=https://${ENDPOINT}:35357/v2.0
    export OS_KEY=/vagrant/cakey.pem
    export OS_CACERT=/vagrant/ca.pem
  2. We can now define the swift service in Keystone as follows:
    # Configure the OpenStack Object Storage Endpoint
    # Configure the OpenStack Object Storage Endpoint
    keystone service-create \
        --name swift \
        --type object-store \
        --description 'OpenStack Object Storage Service'
  3. We define the endpoint as follows. Here, we place the public URL on our public network, 192.168.100.0/24, and the internal and admin URLs on the management network, 172.16.0.0/16:
    # Service Endpoint URLs
    SWIFT_SERVICE_ID=$(keystone service-list \
        | awk '/ swift / {print $2}')
    
    PUBLIC_URL='http://192.168.100.209:8080/v1/AUTH_%(tenant_id)s'
    ADMIN_URL='http://172.16.0.209:8080/v1'
    INTERNAL_URL='http://172.16.0.209:8080/v1/AUTH_%(tenant_id)s'
    
    keystone endpoint-create --region RegionOne \
        --service_id $SWIFT_SERVICE_ID \
        --publicurl "$PUBLIC_URL" \
        --adminurl "$ADMIN_URL" \
        --internalurl "$INTERNAL_URL"
  4. With the endpoints configured to point to our OpenStack Object Storage server, we can now set up the swift user so that our proxy server can authenticate with the OpenStack identity server:
    # Get the service tenant ID
    SERVICE_TENANT_ID=$(keystone tenant-list \
        | awk '/ service / {print $2}')
    
    # Create the swift user with password swift
    keystone user-create \
        --name swift \
        --pass swift \
        --tenant_id $SERVICE_TENANT_ID \
        --email swift@localhost \
        --enabled true
    
    # Get the swift user id
    USER_ID=$(keystone user-list \
        | awk '/ swift / {print $2}')
    
    # Get the admin role id
    ROLE_ID=$(keystone role-list \
        | awk '/ admin / {print $2}')
    
    # Assign the swift user admin role in service tenant
    keystone user-role-add \
        --user $USER_ID \
        --role $ROLE_ID \
        --tenant_id $SERVICE_TENANT_ID
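Note that the public and internal URLs are stored in Keystone as templates containing the literal placeholder %(tenant_id)s (with a %, not a $, so the shell leaves it untouched). At token-issue time, Keystone substitutes each tenant's ID into the template, giving every tenant its own AUTH_-prefixed storage path. A quick local sketch of the resulting shape, using a made-up tenant ID purely for illustration:

```shell
# Endpoint template as stored in Keystone; single quotes keep the
# %(tenant_id)s placeholder literal so the shell leaves it alone
TEMPLATE='http://192.168.100.209:8080/v1/AUTH_%(tenant_id)s'

# Hypothetical tenant ID, for illustration only
TENANT_ID=3f2d1a7e9c0b4d58

# Mimic the substitution Keystone performs when issuing a token
URL=$(printf '%s' "$TEMPLATE" | sed "s/%(tenant_id)s/$TENANT_ID/")
echo "$URL"
# http://192.168.100.209:8080/v1/AUTH_3f2d1a7e9c0b4d58
```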

How it works...

To use Swift, we authenticate through Keystone and, as a result, Swift also needs entries in Keystone to function. We first define the service, as we do for any service in OpenStack; in this case, Swift is of the object-store type. Then, we define the endpoints. Swift utilizes two networks: a front-facing network labeled as public (the network that API requests from clients traverse) and an internal network for intercommunication between the services. Finally, we create the service tenant user. In this case, we use swift as the username, and we also set swift as the password. In production, you would choose a much stronger, randomly generated password for this purpose. Like any other OpenStack service, this user is given the admin role in the service tenant.
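Later in the chapter, the swift user's credentials are consumed by the proxy server's Keystone auth_token middleware. As a rough sketch only (exact option names vary between OpenStack releases, and the auth_host value assumes the controller address used in step 1), the relevant section of proxy-server.conf would look something like this:

```ini
[filter:authtoken]
paste.filter_factory = keystoneclient.middleware.auth_token:filter_factory
; Keystone admin endpoint -- assumes the controller address from step 1
auth_host = 192.168.100.200
auth_port = 35357
auth_protocol = https
; Service-tenant credentials created in this recipe
admin_tenant_name = service
admin_user = swift
admin_password = swift
```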
