Do's and Don'ts for an Automation Engineer

Modified on Thu, 10 Oct at 7:17 PM

The following Do's and Don'ts summarize recommended practices for an automation engineer.

Do understand the project requirements, perform requirement analysis, and define the testing objectives in collaboration with the relevant stakeholders.

Don't skip understanding requirements; unclear goals lead to ineffective testing.

Ensure that you create a comprehensive test plan outlining the scope, testing types, and resources.

Do start testing early, before the code is pushed to production.
Do create test cases and test all features and functionalities before the code is pushed to Staging, and create an execution report.

Don't delay the testing process. Ensure testing is completed thoroughly so that users do not encounter defects after the features are released to production.


Avoid using absolute locator paths (for example, absolute XPaths), since they require frequent updates whenever the structure or hierarchy of the UI elements in the application changes.
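For illustration only: assuming a Selenium WebDriver script in Java (the URL, element ID, and XPath below are hypothetical), a locator anchored to a stable attribute survives layout changes that would break an absolute path.

```java
import org.openqa.selenium.By;
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.chrome.ChromeDriver;

public class LocatorExample {
    public static void main(String[] args) {
        WebDriver driver = new ChromeDriver();
        driver.get("https://example.com/login");

        // Brittle: an absolute path breaks whenever any ancestor in the UI hierarchy changes.
        // driver.findElement(By.xpath("/html/body/div[2]/div/form/div[3]/button")).click();

        // Robust: a relative locator tied to a stable attribute survives layout changes.
        driver.findElement(By.id("loginButton")).click();

        driver.quit();
    }
}
```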

Implement dynamic wait strategies in your tests to improve test execution. For example, the 'wait for control visible' action instructs the system to wait until an element is visible before clicking it, ensuring it is clicked only when it is actually visible on the screen. algoQA includes default actions to support this; for more information, see Default actions.

Avoid hard-coded waits (fixed time intervals) in your actions, since they may lead to flaky tests.

Note that even with a dynamic wait, if the element does not become visible within the specified timeframe, the test case may still fail.
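algoQA's default actions handle this wait internally; for anyone writing or reviewing generated code, the sketch below shows the same idea with Selenium's WebDriverWait in Java (the locator and timeout values are illustrative assumptions).

```java
import java.time.Duration;
import org.openqa.selenium.By;
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.chrome.ChromeDriver;
import org.openqa.selenium.support.ui.ExpectedConditions;
import org.openqa.selenium.support.ui.WebDriverWait;

public class DynamicWaitExample {
    public static void main(String[] args) {
        WebDriver driver = new ChromeDriver();
        driver.get("https://example.com/login");

        // Anti-pattern: a fixed sleep either wastes time or is too short on a slow run.
        // Thread.sleep(5000);

        // Dynamic wait: poll until the element is visible, up to a 10-second timeout,
        // and only then click it. If it never becomes visible, a TimeoutException is thrown.
        WebDriverWait wait = new WebDriverWait(driver, Duration.ofSeconds(10));
        wait.until(ExpectedConditions.visibilityOfElementLocated(By.id("submitButton"))).click();

        driver.quit();
    }
}
```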

Do use appropriate naming conventions, such as snake_case for JSON configuration file keys, camelCase for variables, and PascalCase for UI elements.
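A brief, purely illustrative example (all names below are hypothetical):

```java
// JSON configuration file keys in snake_case, e.g. in a hypothetical config.json:
//   { "browser_type": "chrome", "base_url": "https://staging.example.com" }

// PascalCase for the UI element / page object, camelCase for variables.
public class LoginPage {
    private String userNameField = "txtUserName";  // camelCase variable
    private String loginButton = "btnLogin";       // camelCase variable
}
```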

Ensure that, while picking objects, you follow the parent-child hierarchy when automating test cases with specific frameworks.

Skipping this hierarchy makes it harder to locate and manage related objects within the application or codebase. 
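As a hypothetical illustration (assuming a web application automated with Selenium in Java; the element names are invented), locating a child control through its parent keeps related objects grouped:

```java
import org.openqa.selenium.By;
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.WebElement;
import org.openqa.selenium.chrome.ChromeDriver;

public class ParentChildExample {
    public static void main(String[] args) {
        WebDriver driver = new ChromeDriver();
        driver.get("https://example.com/orders");

        // Pick the parent container first, then locate its children relative to it,
        // so related objects stay grouped and easy to maintain.
        WebElement ordersTable = driver.findElement(By.id("ordersTable"));
        WebElement firstRowStatus = ordersTable.findElement(By.cssSelector("tr:first-child .status"));

        System.out.println(firstRowStatus.getText());
        driver.quit();
    }
}
```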

Ensure that test data is generated in a separate YAML file by enabling the Data Decoupling option on the Project Settings page (see the example below).

Avoid storing test data in the Feature File, as it:
- makes it harder to update or manage test data, and
- limits the ability to reuse test data across multiple test cases or scenarios.
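As a sketch of the decoupled approach (assuming the generated scripts are in Java and the SnakeYAML library is available; the file name and keys are hypothetical), the test data can be read from the YAML file at run time instead of being embedded in the feature file:

```java
import java.io.InputStream;
import java.nio.file.Files;
import java.nio.file.Paths;
import java.util.Map;
import org.yaml.snakeyaml.Yaml;

public class TestDataLoader {
    public static void main(String[] args) throws Exception {
        // testdata.yaml, kept outside the feature file, might contain:
        //   login:
        //     user_name: "qa_user"
        //     password: "secret"
        try (InputStream in = Files.newInputStream(Paths.get("testdata.yaml"))) {
            Map<String, Map<String, String>> data = new Yaml().load(in);
            System.out.println("Logging in as " + data.get("login").get("user_name"));
        }
    }
}
```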

  

Make sure that test data is stored in configuration files. 


Avoid storing the test data in custom files. 
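A minimal sketch, assuming a Java test project and a hypothetical config.properties file:

```java
import java.io.InputStream;
import java.nio.file.Files;
import java.nio.file.Paths;
import java.util.Properties;

public class ConfigDataExample {
    public static void main(String[] args) throws Exception {
        // config.properties might contain, for example:
        //   base_url=https://staging.example.com
        //   default_user=qa_user
        Properties config = new Properties();
        try (InputStream in = Files.newInputStream(Paths.get("config.properties"))) {
            config.load(in);
        }
        // Values come from the configuration file rather than from ad-hoc custom files.
        System.out.println("Base URL: " + config.getProperty("base_url"));
    }
}
```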


Use tags to group and selectively execute scripts.

Not using tags increases execution time.
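For example, if the generated scripts run through Cucumber JVM with JUnit 4 (an assumption; the tag, paths, and package names below are hypothetical), a runner can restrict execution to tagged scenarios:

```java
import org.junit.runner.RunWith;
import io.cucumber.junit.Cucumber;
import io.cucumber.junit.CucumberOptions;

// Only scenarios tagged @smoke are executed instead of the whole suite,
// which keeps execution time down.
@RunWith(Cucumber.class)
@CucumberOptions(
        features = "src/test/resources/features",
        glue = "stepdefinitions",
        tags = "@smoke"
)
public class SmokeTestRunner {
}
```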


Don't generate only one Feature File; instead, generate multiple Feature Files so that scenarios stay organized and can be executed selectively.



Clear cookies and cache before starting each test.
Avoid keeping unnecessary browser tabs or windows open during testing sessions, as they can slow down test execution.
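A minimal sketch using Selenium WebDriver in Java (the incognito flag and URL are illustrative assumptions), starting each test in a fresh session with cookies cleared:

```java
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.chrome.ChromeDriver;
import org.openqa.selenium.chrome.ChromeOptions;

public class CleanSessionExample {
    public static void main(String[] args) {
        // A fresh incognito session avoids stale cache from earlier runs.
        ChromeOptions options = new ChromeOptions();
        options.addArguments("--incognito");

        WebDriver driver = new ChromeDriver(options);
        driver.manage().deleteAllCookies();  // clean cookie state before the test

        driver.get("https://example.com");
        // ... run the test ...
        driver.quit();  // closes every window and tab opened by this session
    }
}
```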

Set the option on the Project Settings page to take automatic backups of your project files daily, weekly, and monthly.
Alternatively, you can take a manual backup by right-clicking the appropriate project file name and using the More Menu option in the Project Explorer.

Not taking backups may result in data loss and disrupt the workflow.

Get the test execution report reviewed by the relevant stakeholders.
Create bugs or enhancements after thorough testing.

Avoid skipping the review cycles.


Ensure that you provide appropriate names for custom actions and reuse the custom actions wherever possible.


Not reusing custom actions may result in script generation failures.

 

 
