Throughout this book, we have seen how data-driven design is highly effective at adding dynamicity and multiplicity to test automation. We have used it in two contexts: binding a regular test step to a data table column, and binding a verification step to one. Harnessing the power of this feature in other contexts can also be useful.
Chapter 4, Maintaining Test Elements focused on the default expressions used by Test Studio to locate interface elements. It elaborated on some strategies to add robustness to these expressions by basing them on known, non-changing attributes. Among these strategies, we saw a built-in Test Studio feature called chained expressions, which was demonstrated through an example that attempts to locate a cell inside a data grid. The following screenshot previews this example and describes the solution, which involves hardcoding values at the leaf level of the chained expression:
Notice how the TextContent operand takes a text value equal to 2611. We will see how this restriction denies us the possibility of binding this test to a data source. So let's assume we have the following test case to automate, which will eventually be tied to a data source, by performing the following steps:
Func-1_FileCompare_Equal_Successful_In1.trx entry (which is contained under the first column). The expected result is that the Details window contains the text, In1.
In a data-driven context, the test case that follows will be tied to a data source by performing the following steps:
Func-1_FileCompare_Equal_Successful_In2.trx entry (which is contained under the second column). The expected result is that the Details window contains the text, In2.
In the third step of the procedure, if we follow the same chained-expression strategy to implement the find expression of the cell to click, we will hardcode the last TextContent operand to the cell content. We will therefore lose the flexibility of varying the destination cell at runtime by varying the cell text value. In this section, we will see how to make chained expressions data driven.
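Conceptually, the transformation we are after can be sketched as follows. This is a stand-alone illustration, not Test Studio's actual binding API: the dictionary merely stands in for the active data row, and the clause strings are simplified.

```csharp
using System;
using System.Collections.Generic;

// Hypothetical sketch, not Test Studio's binding API: the dictionary
// stands in for the active data row of the local data table.
var activeRow = new Dictionary<string, string>
{
    ["FileName"] = "Func-1_FileCompare_Equal_Successful_In1.trx"
};

// Leaf clause as recorded, with the cell text hardcoded:
string hardcoded = "TextContent=2611";

// Leaf clause rebuilt on each iteration from the bound column:
string dataDriven = "TextContent=" + activeRow["FileName"];

Console.WriteLine(hardcoded);
Console.WriteLine(dataDriven);
```

The only change between iterations is the value fetched from the row; the structure of the chained expression stays intact.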
Inside Test Studio, add a WPF test under the Data-Driven Tests folder, name it Func-16_History_ContextMenu_Successful, and then execute the following steps:
TextContent. A combobox will be enabled for the value field. Expand it and choose File Name as shown in the following screenshot. The File Name option corresponds to one of the columns created earlier inside the local data table. For each iteration, the TextContent operand will be assigned the value of the active row. During execution, Test Studio will try to locate this text inside any data grid row. Now that we have transformed the process of finding a cell into a dynamic activity, one problem still hinders the successful execution of the test: the test verification step has a hardcoded value as well. Here comes the role of the File Details column. Therefore, to make the verification step data driven, perform the following steps:
Run the test and notice how the clicks on the data grid cells occur in different places during each run. The overall passing status of the tests also confirms that the verifications were successfully updated with the changing data.
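The behavior just described can be modeled in a few lines. The following is a stand-alone sketch with a fabricated grid and rows, not Test Studio code; it only illustrates how varying the bound text varies both the located cell and the verified value on each iteration:

```csharp
using System;
using System.Collections.Generic;

// Fabricated stand-ins for the data table rows and the data grid contents.
var rows = new[]
{
    (FileName: "Func-1_FileCompare_Equal_Successful_In1.trx", FileDetails: "In1"),
    (FileName: "Func-1_FileCompare_Equal_Successful_In2.trx", FileDetails: "In2"),
};
var grid = new List<string> { rows[0].FileName, rows[1].FileName };

int passCount = 0;
foreach (var row in rows)
{
    // Locate the target cell by the text bound from the File Name column...
    int cellIndex = grid.IndexOf(row.FileName);
    // ...and model the verification against the File Details column.
    bool verified = cellIndex >= 0 && row.FileName.Contains(row.FileDetails);
    if (verified) passCount++;
    Console.WriteLine($"cell {cellIndex}: verification {(verified ? "passed" : "failed")}");
}
```

Each iteration clicks a different cell and checks a different expected value, which is exactly what the passing data-driven run demonstrates.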
Other than their regular usage, Test Studio particularly allows variables to be used as information carriers among the several test steps. On one hand, we have seen how to control them through the IDE using the binding property, and on the other hand, through code, by using the SetExtractedValue and GetExtractedValue methods. But are variables confined to the execution scope of a single test?
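Before answering, it helps to picture extracted values as a shared store. The following stand-alone sketch models this behavior with a plain dictionary; it is an illustration of the concept, not Test Studio's internal implementation:

```csharp
using System;
using System.Collections.Generic;

// Minimal model (an assumption about behavior, not Test Studio internals):
// extracted values act like a shared store that outlives a single test step.
var extractedValues = new Dictionary<string, object>();

void SetExtractedValue(string key, object value) => extractedValues[key] = value;
object GetExtractedValue(string key) => extractedValues[key];

// The parent test stores the value...
SetExtractedValue("logString", "The files resulted in equal comparison!");

// ...and the test executed as a step reads it back later.
string received = (string)GetExtractedValue("logString");
Console.WriteLine(received);
```

If the store is shared across the steps of a run, a test executed as a step can read what its parent wrote, which is the mechanism exploited in the rest of this section.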
While creating most of the tests, we made use of the custom LogResult method, which prints its result after the log messages. While applying the reusability scheme, you notice that this method is replicated all over the tests and therefore constitutes a maintainability threat, so you start taking action to abstract its definition away from the individual tests. Theoretically, you would want to create a test A that alone holds the definition of this method and hence alone receives any future changes to its functionality. Afterwards, you would replace all the internal coded steps currently implementing the LogResult method with a call to test A. However, how would you vary the string passed as a parameter to test A, and from there to the LogResult method inside it?
Test Studio offers flexibility in variable creation, as we saw in the first data-driven example of this chapter. At compilation time, it allows the usage of uninitialized variables, whether through test step binding or through code; during test crafting, there is no validation of the variable's existence. However, exceptions will be thrown at runtime if the variable has not been initialized by then. This section makes use of this flexibility to solve the problem at hand.
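The compile-time versus runtime distinction can be sketched as follows. This stand-alone model uses a plain dictionary in place of the test's variable store; the point is only that the lookup compiles fine and fails solely at runtime when the variable was never initialized:

```csharp
using System;
using System.Collections.Generic;

// Stand-alone model: a plain dictionary plays the role of the test's
// variable store. The lookup below compiles fine; it only fails at
// runtime when nothing has initialized the variable yet.
var variables = new Dictionary<string, object>();

bool found = variables.TryGetValue("logString", out var value);
Console.WriteLine(found
    ? value
    : "runtime failure: 'logString' was never initialized");

// Once an earlier step (or a data binding) sets the variable,
// the same lookup succeeds.
variables["logString"] = "The files resulted in equal comparison!";
Console.WriteLine(variables["logString"]);
```

The same test body behaves differently depending on whether an earlier step supplied the variable, which is the flexibility this section relies on.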
For this example, we will need two WPF tests called Func-17_PrintSubmissionResult_Successful and Op-Common_Log, respectively.
For the Func-17_PrintSubmissionResult_Successful test, perform the following steps:
Change the DataBindVariableName field from CompareFilesTextblock to logString.
logString is the name of the variable to be used as the parameter to the embedded test. For Op-Common_Log, add the following coded step:
var text = "String to print is: " + (string)GetExtractedValue("logString") + Environment.NewLine;
using (System.IO.FileStream fileStream = new System.IO.FileStream(@"C:\File Comparer Files\Log.txt",
    System.IO.FileMode.OpenOrCreate, System.IO.FileAccess.ReadWrite, System.IO.FileShare.Read))
{
    fileStream.Seek(0, System.IO.SeekOrigin.End);
    byte[] buffer = Encoding.UTF8.GetBytes(text);
    fileStream.Write(buffer, 0, buffer.Length);
}
The first statement in the preceding code uses the GetExtractedValue method to extract the value of the logString variable that is passed at runtime by the parent test. The remaining code in the method opens the logfile and writes the content of the variable to it.
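For comparison, the same append-to-logfile behavior can be achieved more compactly with File.AppendAllText. The following stand-alone sketch writes to a temporary path so it can run anywhere (the book's path is C:\File Comparer Files\Log.txt), and hardcodes the message in place of the GetExtractedValue("logString") call:

```csharp
using System;
using System.IO;

// Compact alternative sketch to the FileStream code above. A temporary
// path is used so the sketch runs anywhere, and the message is hardcoded
// in place of the GetExtractedValue("logString") call.
string logPath = Path.Combine(Path.GetTempPath(), "Log.txt");
string text = "String to print is: The files resulted in equal comparison!" + Environment.NewLine;

// Creates the file if it does not exist; otherwise appends at the end.
File.AppendAllText(logPath, text);

Console.WriteLine(File.ReadAllText(logPath).Contains("String to print is: "));
```

File.AppendAllText handles the open-or-create, seek-to-end, encode, and close steps internally, which keeps the coded step short and less error prone.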
Go back to the Func-17_PrintSubmissionResult_Successful test and, using the Test as Step feature, add a call to the Op-Common_Log test. Make sure that the added step is the last one.
Run the test and, after it finishes execution, open the logfile referred to in the Op-Common_Log test. Notice how the inner test successfully receives the tab name and prints it to the file as follows:
String to print is: The files resulted in equal comparison!