Automation Framework


High-Level Automation Framework Information
At a very high level, the framework components can be divided into two categories: a software part and a documentation part. Here is my list of components; add your own items if anything is missing.
--------------------------------------------
Software part
--------------------------------------------
1. Supporting libraries for logging, error handling, and execution management
2. Tool-specific code
3. Setup and configuration scripts
4. Test management code
5. Test data and test data management code
6. Platform/OS-specific scripts
--------------------------------------------
Non-software part
--------------------------------------------
Folder structure
Documents
i. Coding guidelines
ii. Procedure for creating scripts
iii. Planning, design and review procedures
iv. Approach for automation
v. Approach for Testing automation code
vi. Automation documentation - how scripts work
vii. Setup and implementation procedures
viii. Defect tracking procedure for Automation scripts
ix. Source control procedures
x. Project plan
xi. Templates for effort estimation
xii. Template for test case categorization
xiii. Template for ROI calculation
Choosing a Test Automation Framework
A test automation framework is a set of assumptions, concepts, and practices that provide support for automated software testing. This article describes and demonstrates five basic frameworks.

Basing an automated testing effort on a capture tool such as HP QTP, used only to record and play back test cases, has its drawbacks. Running complex and powerful tests is time-consuming and expensive when using only a capture tool. Because these tests are created ad hoc, their functionality can be difficult to track and reproduce, and they can be costly to maintain.

A better choice for an automated testing team that's just getting started might be to use a test automation framework, defined as a set of assumptions, concepts, and practices that constitute a work platform or support for automated testing. In this article I'll attempt to shed a little light on a handful of the test automation frameworks I'm familiar with -- specifically, test script modularity, test library architecture, keyword-driven/table-driven testing, data-driven testing, and hybrid test automation.
Types of Frameworks
>> Test Script Modularity
>> Test Library Architecture
>> Data-Driven Testing
>> Keyword-Driven/Table-Driven Testing
>> Hybrid Test Automation
1. Test Script Modularity
The test script modularity framework is the most basic of the frameworks. It applies a well-known programming strategy: build an abstraction layer in front of a component to hide it from the rest of the application. This insulates the application from modifications in the component and provides modularity in the application design. When working with test scripts (in any language or proprietary environment), this can be achieved by creating small, independent scripts that represent modules, sections, and functions of the application-under-test. These small scripts are then combined in a hierarchical fashion to construct larger tests. Using this framework yields a high degree of modularization and adds to the overall maintainability of the test scripts.
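A minimal sketch of the idea, using an imaginary calculator as the application-under-test; all names (test_addition, run_arithmetic_suite, and so on) are illustrative, not part of any specific tool:

```python
def add(a, b):
    """Stand-in for one function of the application-under-test."""
    return a + b

def test_addition():
    """Small, independent script covering one function."""
    return add(2, 3) == 5

def test_addition_negative():
    """Another small, independent script."""
    return add(-2, -3) == -5

def run_arithmetic_suite():
    """Mid-level script that combines the small scripts."""
    return test_addition() and test_addition_negative()

def run_full_regression():
    """Top-level script built hierarchically from the suites."""
    return all([run_arithmetic_suite()])

print(run_full_regression())  # True when all modular scripts pass
```

Each level only knows about the level below it, so a change to one application function touches a single small script rather than every large test.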
2. Test Library Architecture
The test library architecture framework is very similar to the test script modularity framework and offers the same advantages, but it divides the application-under-test into procedures and functions (or objects and methods depending on the implementation language) instead of scripts. This framework requires the creation of library files (SQABasic libraries, APIs, DLLs, and such) that represent modules, sections, and functions of the application-under-test. These library files are then called directly from the test case script. Much like script modularization this framework also yields a high degree of modularization and adds to the overall maintainability of the tests.
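The same idea can be sketched in Python, with a class standing in for a library file (DLL, API module, and such); the login module and all names here are invented for illustration:

```python
class LoginLibrary:
    """Library of reusable procedures for the login module of the
    application-under-test (hypothetical example)."""

    def __init__(self, valid_users):
        self.valid_users = valid_users  # simulated back end

    def login(self, user, password):
        """Reusable procedure: attempt a login, return success/failure."""
        return self.valid_users.get(user) == password

    def logout(self):
        """Reusable procedure: end the session."""
        return True

# --- test case script: calls the library functions directly ---
lib = LoginLibrary({"alice": "secret"})
results = [
    lib.login("alice", "secret"),     # valid credentials accepted
    not lib.login("alice", "wrong"),  # bad password rejected
    lib.logout(),
]
print(all(results))  # True
```

The test case script contains no application-handling logic of its own; everything reusable lives in the library, which is the point of this framework.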
3. Data-Driven Testing
A data-driven framework is one in which test input and output values are read from data files (ODBC sources, CSV files, Excel files, DAO objects, ADO objects, and such) and loaded into variables in captured or manually coded scripts. In this framework, variables are used both for input values and for output verification values. Navigation through the program, reading of the data files, and logging of test status and information are all coded in the test script. This is similar to table-driven testing (discussed shortly) in that the test case is contained in the data file and not in the script; the script is just a "driver," or delivery mechanism, for the data. In data-driven testing, however, only test data is contained in the data files.
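A minimal data-driven sketch: inputs and expected outputs are read from a CSV data file, and one generic script loops over the rows. The column names are invented, and an in-memory file stands in for the external .csv to keep the sketch self-contained:

```python
import csv
import io

# In practice this would be an external .csv data file per test case.
data_file = io.StringIO(
    "a,b,expected\n"
    "2,3,5\n"
    "10,-4,6\n"
    "0,0,0\n"
)

def application_add(a, b):
    """Stand-in for the application-under-test."""
    return a + b

failures = 0
for row in csv.DictReader(data_file):
    actual = application_add(int(row["a"]), int(row["b"]))
    if actual != int(row["expected"]):
        failures += 1  # log and continue rather than abort
print(failures)  # 0
```

Note that the script never changes when new cases are added; only the data file grows.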

i) Merits of data-driven testing :: The merits of the Data-Driven test automation framework are as follows:
> Scripts may be developed while application development is still in progress
> Utilizing a modular design, and using files or records to both input and verify data, reduces redundancy and duplication of effort in creating automated test scripts
> If functionality changes, only the specific "Business Function" script needs to be updated
> Data input/output and expected results are stored as easily maintainable text records
> Functions return "TRUE" or "FALSE" values to the calling script, rather than aborting, allowing for more effective error handling, and increasing the robustness of the test scripts. This, along with a well-designed "recovery" routine, enables "unattended" execution of test scripts.

ii) Demerits of data-driven testing :: The demerits of the Data-Driven test automation framework are as follows:
> Requires proficiency in the Scripting language used by the tool (technical personnel)
> Multiple data-files are required for each Test Case. There may be any number of data-inputs and verifications required, depending on how many different screens are accessed. This usually requires data-files to be kept in separate directories by Test Case
> Tester must not only maintain the Detail Test Plan with specific data, but must also re-enter this data in the various required data-files
> If a simple "text editor" such as Notepad is used to create and maintain the data-files, careful attention must be paid to the format required by the scripts/functions that process the files, or script-processing errors will occur due to data-file format and/or content being incorrect
4. Keyword-Driven/Table-Driven Testing
This requires the development of data tables and keywords, independent of the test automation tool used to execute them and the test script code that "drives" the application-under-test and the data. Keyword-driven tests look very similar to manual test cases. In a keyword-driven test, the functionality of the application-under-test is documented in a table as well as in step-by-step instructions for each test. In this method, the entire process is data-driven, including functionality.

Example ::
To open a window, the following table is devised; it can be reused for any other application simply by changing the window name.
Window      | Control    | Action | Arguments
Window Name | Menu       | Click  | File, Open
Window Name | Menu       | Click  | Close
Window Name | Pushbutton | Click  | Folder Name
Window Name |            | Verify | Results

Once the test tables are created, a driver script (or a set of scripts) is written that reads each step, executes the step based on the keyword contained in the Action field, performs error checking, and logs any relevant information.
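A toy driver in this spirit, written in Python: it reads each row of the table, dispatches on the Action keyword, and logs the result. The action handlers here only record what they would do; real handlers would drive the application-under-test through the tool's API:

```python
log = []

def do_click(control, arguments):
    """Hypothetical handler for the Click keyword."""
    log.append(f"clicked {control}: {arguments}")
    return True

def do_verify(control, arguments):
    """Hypothetical handler for the Verify keyword."""
    log.append(f"verified {arguments}")
    return True

ACTIONS = {"Click": do_click, "Verify": do_verify}

# (Window, Control, Action, Arguments) rows, mirroring the table above.
steps = [
    ("Main", "Menu", "Click", "File, Open"),
    ("Main", "Menu", "Click", "Close"),
    ("Main", "Pushbutton", "Click", "Folder Name"),
    ("Main", "", "Verify", "Results"),
]

ok = True
for window, control, action, arguments in steps:
    handler = ACTIONS.get(action)
    if handler is None:
        log.append(f"ERROR: unknown keyword {action}")  # error checking
        ok = False
    else:
        ok = handler(control, arguments) and ok
print(ok, len(log))  # True 4
```

Unknown keywords are logged rather than crashing the run, which is the error-checking behavior the paragraph above calls for.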

i) Merits of keyword driven testing :: The merits of the Keyword Driven Testing are as follows:
> The Detail Test Plan can be written in Spreadsheet format containing all input and verification data.
> If "utility" scripts can be created by someone proficient in the automated tool's Scripting language prior to the Detail Test Plan being written, then the tester can use the Automated Test Tool immediately via the "spreadsheet-input" method, without needing to learn the Scripting language.
> The tester need only learn the "Key Words" required, and the specific format to use within the Test Plan. This allows the tester to be productive with the test tool very quickly, and allows more extensive training in the test tool to be scheduled at a more convenient time.

ii) Demerits of keyword driven testing :: The demerits of the Keyword Driven Testing are as follows:
> Development of "customized" (Application-Specific) Functions and Utilities requires proficiency in the tool's Scripting language. (Note that this is also true for any method)
> If the application requires more than a few "customized" utilities, the tester will need to learn a number of "Key Words" and special formats. This can be time-consuming and may have an initial impact on test plan development. Once the testers get used to it, however, the time required to produce a test case drops considerably.
5. Hybrid Test Automation
The most commonly implemented framework is a combination of all of the above techniques, pulling from their strengths and trying to mitigate their weaknesses. This hybrid test automation framework is what most frameworks evolve into over time and multiple projects. The most successful automation frameworks generally accommodate both Keyword-Driven testing as well as Data-Driven scripts.

This allows data driven scripts to take advantage of the powerful libraries and utilities that usually accompany a keyword driven architecture. The framework utilities can make the data driven scripts more compact and less prone to failure than they otherwise would have been.

The utilities can also facilitate the gradual and manageable conversion of existing scripts to keyword-driven equivalents when and where that appears desirable. On the other hand, the framework can use scripts to perform some tasks that might be too difficult to re-implement in a pure keyword-driven approach, or where the keyword-driven capabilities are not yet in place.
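To make the combination concrete, here is a small hybrid sketch: a keyword-driven driver whose "Verify" steps pull their expected values from a separate data table, so one framework serves both styles. All keywords, fields, and values are invented for illustration:

```python
# Data-driven part: expected values live in a data table, not in the steps.
test_data = {"title": "Home Page", "user": "alice"}

def app_get(field):
    """Stand-in for reading a value from the application-under-test."""
    return {"title": "Home Page", "user": "alice"}[field]

# Keyword-driven part: handlers dispatched by keyword.
def kw_verify(field):
    return app_get(field) == test_data[field]

def kw_navigate(field):
    return True  # navigation simulated as a no-op here

KEYWORDS = {"Verify": kw_verify, "Navigate": kw_navigate}

steps = [("Navigate", "title"), ("Verify", "title"), ("Verify", "user")]
passed = all(KEYWORDS[action](arg) for action, arg in steps)
print(passed)  # True
```

Changing the expected results means editing only `test_data`; changing the test flow means editing only `steps` — each concern stays in its own table, which is the hybrid's appeal.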