Choosing metadata

While the build.xml and build.properties files are relatively static, the package.xml file can change every time you deploy. At its simplest, the package.xml file can be as bare bones as this:

<?xml version="1.0" encoding="UTF-8"?>
<Package xmlns="http://soap.sforce.com/2006/04/metadata">
  <types>
    <members>*</members>
    <name>ApexClass</name>
  </types>
  <types>
    <members>*</members>
    <name>ApexPage</name>
  </types>
  <version>31.0</version>
</Package>

This package.xml file specifies two metadata types, ApexClass and ApexPage, and it directs the Ant migration tool to include all metadata of these two types in the deployment. Essentially, this package.xml affects all classes and pages. Note the Package node's xmlns attribute. You'll need to specify this! A package.xml file can specify any number of metadata types and is used for both retrieval and deployment. Thus, with a package.xml file like this one, a retrieval call leaves you with a src/ folder containing two subfolders: classes, which contains all of your Apex class code, and pages, which contains your Visualforce page source. Likewise, if you made a deployment call with this package.xml file, you'd be deploying the contents of the src/classes and src/pages folders.
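The package.xml file does its work in concert with the build.xml file mentioned earlier. As a rough sketch (the target names here are illustrative, and the credential properties are assumed to be defined in build.properties; the sf:retrieve and sf:deploy tasks come from the Ant migration tool's antlib), a minimal build.xml might look like this:

```xml
<project name="Salesforce deployments" default="deployCode" basedir="."
         xmlns:sf="antlib:com.salesforce">
  <!-- build.properties is assumed to supply sf.username, sf.password,
       and sf.serverurl -->
  <property file="build.properties"/>

  <!-- Retrieve everything listed in src/package.xml into the src/ folder -->
  <target name="retrieveCode">
    <sf:retrieve username="${sf.username}" password="${sf.password}"
                 serverurl="${sf.serverurl}"
                 retrieveTarget="src" unpackaged="src/package.xml"/>
  </target>

  <!-- Deploy the contents of src/, as described by src/package.xml -->
  <target name="deployCode">
    <sf:deploy username="${sf.username}" password="${sf.password}"
               serverurl="${sf.serverurl}" deployRoot="src"/>
  </target>
</project>
```

With a file like this in place, running `ant retrieveCode` or `ant deployCode` performs the retrieval or deployment described by whatever package.xml currently sits in src/.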

Package.xml files don't have to contain such blanket requests for all files of a given type. Indeed, your package.xml can name specific objects and metadata files. For instance, consider the following example:

<?xml version="1.0" encoding="UTF-8"?>
<Package xmlns="http://soap.sforce.com/2006/04/metadata">
  <types>
    <members>MyPageController</members>
    <members>TestFactory</members>
    <name>ApexClass</name>
  </types>
  <types>
    <members>Test_Page</members>
    <name>ApexPage</name>
  </types>
  <types>
    <members>AmazingObject__c</members>
    <members>Account</members>
    <name>CustomObject</name>
  </types>
  <version>31.0</version>
</Package>

This package.xml file names particular files and custom objects to retrieve or deploy, along with the API version they should be retrieved or deployed with. This highlights an additional feature of the Ant toolkit: it can retrieve and deploy not just code, but almost the entire metadata of your org, including custom objects, triggers, classes, and page layouts, all at a specific API version. Sadly, the Ant migration tool does have its limitations. Chief among them is its reliance on the Salesforce metadata API, which has a number of unsupported metadata types that are listed here: https://developer.salesforce.com/docs/atlas.en-us.api_meta.meta/api_meta/meta_unsupported_types.htm.

Alas, I have become destructiveChanges.xml, destroyer of orgs

The Ant migration toolkit has one other crucial feature that is especially well suited to automated processes, and one that is unique to Ant and Force.com deployments: destructive changes. Destructive changes require their own XML file. Syntactically identical to a package.xml file, it defines the metadata components to be deleted from the system. When deleting metadata, it can be important to define the order of destructive changes relative to additions. Because of this, destructive changes can be defined in any of three filenames: destructiveChanges.xml, destructiveChangesPre.xml, or destructiveChangesPost.xml. The pre and post options allow you to specify destructive changes to be made before or after the addition of new or updated metadata. If you do not specify pre or post in the filename, the system defaults to making destructive changes before additions and updates. If you want to make only a destructive change, your package.xml must contain no metadata components but must still specify an API version.
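To illustrate (the component names here are hypothetical), a destructiveChanges.xml file that removes one class and one page uses exactly the same syntax as a package.xml file:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<Package xmlns="http://soap.sforce.com/2006/04/metadata">
  <types>
    <members>ObsoleteController</members>
    <name>ApexClass</name>
  </types>
  <types>
    <members>Obsolete_Page</members>
    <name>ApexPage</name>
  </types>
</Package>
```

For a destructive-only deployment, this file would sit alongside a package.xml that lists no components at all, just the API version:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<Package xmlns="http://soap.sforce.com/2006/04/metadata">
  <version>31.0</version>
</Package>
```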

Despite the verbosity of XML and the unsupported metadata types, the Ant migration toolkit is the most versatile and battle-tested of the deployment solutions available to us. Additionally, because XML is machine readable and writable, the Ant migration toolkit is the de facto standard for deployments from an automated continuous integration system. The Ant toolkit and its various configuration files are essential tools for Salesforce1 developers, and those who truly master it unlock the ability not only to deploy code, but to remove it as well. This helps orgs maintain a clear understanding of what code is active; inactive code should be removed!

The Force.com IDE deployments

Several years ago, Salesforce released an Eclipse IDE plugin that, among other things, enabled developers to retrieve, deploy, and test code from within the IDE. This was later prepackaged with Eclipse as the Force.com IDE. Under the covers, the plugin exercises the same metadata API that the Ant migration toolkit uses. The key difference between the IDE plugin and the Ant migration toolkit boils down to this: the IDE takes care of creating the package.xml file from choices you make in the IDE's graphical user interface. The steps for deploying from within the IDE may not be intuitive at first. First, select the metadata files you wish to deploy in the Package Explorer panel on the left. Once they're selected, click on the deploy button on the toolbar and walk through the wizard window that appears, as shown in the following screenshot:

The Force.com IDE deployments

The wizard will prompt you to provide credentials for the target org and then ask you whether you'd like to create a backup before deploying. Once you've answered those questions, it will log in to the target org and compare the metadata to present you with the preceding window. Here, you'll need to check each of the individual boxes to include them in the deployment. Also, ensure that the action specified makes sense for your org. If you think you're adding a class but the action lists it as an update, it's likely a coworker has beaten you to the punch. When you're satisfied that you understand the changes that are about to happen, click on Next to start the deployment.

If you're intimidated by editing complex XML files, the Force.com IDE offers a friendlier, guided deploy system that is capable of both additive and destructive changes. This comes with a price, however, as the Force.com IDE plugin requires you to use older versions of Eclipse. On the other hand, if you're familiar with Eclipse, this may be the perfect tool for you.

In the last few years, other IDEs and IDE plugins have been created. First among them is the open source MavensMate for Sublime Text and Atom. Like the Force.com IDE, MavensMate utilizes the metadata API to do deployments. Unfortunately, MavensMate doesn't currently expose a graphical interface for destructive changes. The following screenshot shows the deploy options used to validate the deployment:

The Force.com IDE deployments

The process of deploying is a bit different with MavensMate. Rather than preselecting the metadata that you want to deploy, you're first presented with a rich set of options for your deployment. By default, deploys are set to Validate Only and Rollback on Error. Validating before you finally deploy is always a great idea, and rolling back on error means that you won't end up with a half-completed deploy. MavensMate uses tabs to separate the parts of the deployment process. You can use the second tab to establish and edit org connections, which are then selected (including multiple targets at a time) on the first tab at the top. Once you've specified the target orgs, use the metadata tab to select the metadata components that you want to deploy. Take a look at the following screenshot as an example:

The Force.com IDE deployments

I've selected all of the ApexPage, ApexClass, and ApexComponent metadata types. However, if I wanted, I could deselect individual files within those types. After arduously setting up the deployment, you can relax on the Arcade tab with some classic 8-bit arcade games. Don't worry, as soon as the deployment is validated or completed, you'll be taken to the results.

Change is good

While deployments based on Ant and IDE are the backbone and time-tested processes for deploying Salesforce metadata, Salesforce has added a new cloud-based deployment solution known as Change Sets. Change Sets, as their name implies, are bundles of metadata created or updated in a source org that are made available inside another org. At their core, Change Sets work by establishing trust relationships between orgs and passing changes between them. These relationships are referred to as Deployment Connections and can be found at Setup | Deploy | Deployment Settings.

Change is good

Note the Upload Authorization Direction section. The green arrow indicates that a relationship exists and is authorized, in this case, to move metadata from the Conversion sandbox to production. Editing a particular record allows you to establish relationships and set their direction. Keep in mind, however, that Change Sets can only be passed between related orgs, and only those sandboxes created from a production instance are eligible to be related. This means you cannot utilize Change Sets to deploy metadata from one production org to another.

Change Sets can work in both directions, for example, from Sandbox to Production as well as from Production to Sandboxes. This makes them ideal for passing declarative metadata such as page layouts, custom fields, and objects from production orgs to sandboxes without refreshing the entire sandbox. To keep busy coworkers from accidentally overwriting metadata, Change Sets have both outbound and inbound formulations. Outbound Change Sets are Change Sets of metadata from this org, destined for a different org. Conversely, inbound Change Sets contain metadata from a different org for potential inclusion in the current org. I say potential because Change Sets are a two-part deployment process. You must first create an outbound Change Set, then log in to the target org, and validate and deploy the inbound Change Set there. Creating a Change Set can be tedious, and there are a few things to keep in mind. Once a Change Set has been uploaded, it cannot be modified. You can, however, clone it and modify its contents. Because of this, it's useful to include a version number in your Change Set's name. Take the following screenshot as an example of editing a Change Set:

Change is good

As you can see here, I've included v1.0 in the Name field of the Change Set so that I can keep track of which one I'm validating in the target org.

Adding components to a Change Set can get tedious, but it's pretty simple. The UI presents you with a drop-down menu of component types. Once selected, you'll see a list view of metadata matching that component type, as shown here:

Change is good

Once you've added all your metadata components, you can upload the Change Set to any org you've established a relationship with. Once uploaded, it can take a few minutes for your outbound Change Set to appear as an inbound Change Set in the target org. Helpfully, you'll get an e-mail message letting you know when the Change Set is available in the target org. Note that while you can create a Change Set with only metadata components, you almost always want to deploy the changes along with the profiles that will be affected by those changes. As tedious as creating large Change Sets can be, it's nothing compared to manually editing the field-level security of the 30 new fields you just deployed for 5 profiles. Just under the component listing is the listing of profiles that will be deployed in the Change Set. Ensure that you always populate it.

Once your Change Set is available in the target org, you must validate and deploy it. Validation consists of mock deploying the new or updated metadata components and running all of your unit tests. While it's possible to trigger a direct deployment of a Change Set, validating your Change Set before deploying has its benefits. Not only does this surface any errors before they can affect your production org, it also allows for nearly instantaneous deployments that bypass running all tests, provided you've validated the Change Set in the last 24 hours. If your org's tests take hours to run, this means you can validate a Change Set overnight and deploy after a glorious cup of coffee in the morning. The following screenshot shows the status of the deployment:

Change is good

Overall, Change Sets are a fantastic tool for the bidirectional deployment of metadata from one org to another. This is especially true for highly regulated environments, as Change Sets maintain a log of who deployed what, when. However, they are not without their limitations. First and foremost is the inability to make destructive changes via a Change Set. You can add new metadata and deploy updated metadata via Change Sets, but whatever is currently there cannot be removed with a Change Set. Additionally, you need to ensure that your Change Set includes every last metadata component your code relies on. Deploying an object and some workflow rules? Don't forget to include the custom object's individual fields. They're not automatically included when you select an object to deploy, and failing to include a field can cause the entire Change Set to fail. In a very real sense, Change Sets are atomic, and consequently, they can be very frustrating to work with if you're used to IDEs and the Ant migration toolkit.
