Common pitfalls for automation frameworks

While designing automation frameworks, the following are some pitfalls to avoid in order to keep script maintenance low:

  • Test scripts with embedded local test data

This is generally seen in module-driven automation frameworks, where test data (the data with which we perform the testing) is embedded directly into the automation scripts. Whenever we need to test with a different set, or with multiple sets, of test data, the script itself needs maintenance.
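As a hedged sketch of the alternative, the test below pulls its data from an external CSV file instead of hard-coding it; the file name login_data.csv, its columns, and the login() helper are illustrative placeholders rather than part of any specific framework:

```python
import csv
import pytest

def login(username, password):
    # Placeholder for the real application call; assumed for illustration.
    return "success" if username and password else "failure"

def load_rows(path):
    """Read test data rows from an external CSV file."""
    with open(path, newline="") as f:
        return list(csv.DictReader(f))

# Assumes login_data.csv exists with columns: username, password, expected.
@pytest.mark.parametrize("row", load_rows("login_data.csv"))
def test_login(row):
    # The script never embeds test data; a new data set only requires
    # editing the CSV file, not this test.
    assert login(row["username"], row["password"]) == row["expected"]
```

With this shape, adding or changing data sets never touches the script, which is the point of keeping test data external.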

  • Test scripts dependent on a specific OS / browser / DB / device

Scripts limited to running on a specific browser, OS, DB, or device become a maintenance burden. The framework should be designed to be flexible enough to detect the test environment and complete the tests on every supported environment.
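A minimal sketch of this idea, assuming Selenium WebDriver is installed; the TEST_BROWSER environment variable name is an assumption made for illustration, not a standard:

```python
import os
from selenium import webdriver

def make_driver():
    """Create a driver for whichever browser the environment requests."""
    browser = os.environ.get("TEST_BROWSER", "chrome").lower()
    if browser == "firefox":
        return webdriver.Firefox()
    if browser == "edge":
        return webdriver.Edge()
    return webdriver.Chrome()

def test_homepage_title():
    # The test itself never names a browser, so the same script runs
    # unchanged against every supported environment.
    driver = make_driver()
    try:
        driver.get("https://example.com")
        assert "Example" in driver.title
    finally:
        driver.quit()
```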

  • Test scripts designed to run sequentially or in a specific order

Scripts designed to run sequentially in a specific order create pain points: if one script fails midway due to network or run-time conditions, the whole sequence has to be re-run to regain end-to-end coverage. Each script should be able to run, and be re-run, independently.
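A sketch of order-independent scripts, where each test builds its own state through a fixture instead of relying on an earlier test having run; the create_order()/cancel_order() helpers are hypothetical stand-ins for real application calls:

```python
import pytest

def create_order(item):
    # Placeholder for a real application or API call; assumed for illustration.
    return {"item": item, "status": "created"}

def cancel_order(order):
    order["status"] = "cancelled"
    return order

@pytest.fixture
def order():
    # Every test gets its own freshly created order.
    return create_order("widget")

def test_new_order_is_created(order):
    assert order["status"] == "created"

def test_order_can_be_cancelled(order):
    # Does not depend on the previous test having run first, so it can be
    # re-run in isolation after a mid-suite failure.
    assert cancel_order(order)["status"] == "cancelled"
```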

  • Framework designs that do not improve time to test

With a growing number of scenarios and environments it becomes cumbersome to complete regression in time, yet organizations still depend on those run results to complete the cycle. The framework needs to be designed for parallel execution across systems, thereby load balancing tests and improving time to test.
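One hedged way to picture this is fanning independent test groups out to worker processes; the test file names below are illustrative, and tools such as pytest-xdist (pytest -n auto) provide the same load balancing out of the box:

```python
import subprocess
from concurrent.futures import ThreadPoolExecutor

# Illustrative test groups; in practice these could be generated from the suite.
TEST_GROUPS = ["tests/test_login.py", "tests/test_search.py", "tests/test_checkout.py"]

def run_group(path):
    """Run one test group in its own pytest process and return its exit code."""
    return path, subprocess.run(["pytest", path]).returncode

if __name__ == "__main__":
    with ThreadPoolExecutor(max_workers=len(TEST_GROUPS)) as pool:
        results = dict(pool.map(run_group, TEST_GROUPS))
    failed = [path for path, code in results.items() if code != 0]
    print("Failed groups:", failed or "none")
```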

  • Frameworks not designed for flexibility

With frequent changes in features and test needs, teams have to extend the framework's capabilities without redesigning it. Frameworks need to support this kind of modularity so that testing standards are maintained as the suite grows.
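A small sketch of such modularity in the page-object style, assuming Selenium WebDriver; the LoginPage URL and locators are assumptions for illustration. When a feature's UI changes, only this class changes, and the tests that use it keep their shape:

```python
from selenium.webdriver.common.by import By

class LoginPage:
    """Encapsulates one screen; tests call methods and never touch locators."""
    URL = "https://example.com/login"   # illustrative URL

    def __init__(self, driver):
        self.driver = driver

    def open(self):
        self.driver.get(self.URL)
        return self

    def sign_in(self, username, password):
        # Locator changes stay inside this class, not in every test.
        self.driver.find_element(By.ID, "user").send_keys(username)
        self.driver.find_element(By.ID, "pass").send_keys(password)
        self.driver.find_element(By.ID, "submit").click()
        return self

# In a test: LoginPage(make_driver()).open().sign_in("user", "secret")
```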

  • Frameworks not designed for run logs / custom reports

Frameworks need to be designed to capture run logs while tests are executed and to prepare custom reports that provide statistical details on coverage and help identify quality trends.
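A minimal sketch of capturing run logs and rolling them up into a custom report; the record structure and the run.log / report.json file names are illustrative rather than tied to any particular tool:

```python
import json
import logging
from collections import Counter

logging.basicConfig(filename="run.log", level=logging.INFO,
                    format="%(asctime)s %(levelname)s %(message)s")

results = []

def record(test_name, status):
    """Log each test outcome as it happens and keep it for the report."""
    logging.info("%s -> %s", test_name, status)
    results.append({"test": test_name, "status": status})

def write_report(path="report.json"):
    """Summarize outcomes so coverage and quality trends can be tracked."""
    summary = Counter(r["status"] for r in results)
    with open(path, "w") as f:
        json.dump({"summary": dict(summary), "results": results}, f, indent=2)

record("test_login", "passed")
record("test_checkout", "failed")
write_report()
```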