algoScraper as an Extension Tool for Chrome or Edge Users

Modified on Mon, 18 Nov at 5:42 PM

TABLE OF CONTENTS

1. Overview

2. Prerequisite 

3. Understanding the Scraper Tool Window

3.1 Tool Functionality Summary

3.2 Scraping the Application Using algoScraper Tool

4. Related Articles

1. Overview

The algoScraper tool, available as a Chrome extension, captures all the User Interface (UI) elements and XPaths present on a web page.


Note: You can access algoScraper directly from the browser bar in Chrome. However, if you face any issues while using this extension, you can also find algoScraper within the Developer Tools panel in Chrome. To access the Developer Tools panel, right-click on the application and select Inspect > Properties > algoScraper. For more information on the usage of this extension, refer to Using algoScraper in the Developer Tools Panel.


2. Prerequisite 

To add this tool to your Chrome or Edge browser, click here. Select the icon corresponding to the extension to view it on your browser bar, as shown:


3. Understanding the Scraper Tool Window

Click the extension icon to view the following window:


Select the page you wish to scrape and click the Scrape UI button to view the following window:

3.1 Tool Functionality Summary

Field | Description
Captures and displays the URL of the page.
Click this button to scrape all the UI elements displayed on the page at once.
While using the Scrape UI button, ensure that you select the control type and provide a meaningful name without additional spaces. If you do not select the control type, the following error message is displayed:

After scraping all the UI elements, you can search for a particular UI element using the search bar. If you enter invalid values in the search bar, an error message is displayed in the secondary window of the left pane. Ensure that your search criteria are accurate and relevant to the data you are trying to retrieve.
Reset window (secondary window)
Click this button to reset all the scraped UI elements listed in the table.

Click this button to access the more actions menu.
Re-scrape
Re-scraping involves re-extracting data from a web page. It is performed when the web page or application under test changes. You can upload the updated re-scraped file while profiling the application and generate test cases.

Choose this option to access the upload section and then perform the following actions:
  • Upload the previously downloaded CSV file that contains the details of the scraped elements.
  • Re-scrape the elements using this option and capture them in the same CSV file.
  • Download the updated CSV file after re-scraping the elements.
  • Upload the updated CSV file.
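The upload/re-scrape cycle above revolves around a CSV file of scraped elements. As a rough stand-alone illustration, here is how such a file could be read, updated after a re-scrape, and written back with Python's csv module. The column names ("Name", "ControlType", "XPath") and the sample rows are assumptions for the demo; algoScraper's actual CSV layout may differ.

```python
import csv
import io

# Hypothetical CSV of scraped elements; the real algoScraper column
# names may differ -- "Name", "ControlType", "XPath" are assumptions.
scraped_csv = """Name,ControlType,XPath
SearchBox,input,//input[@id='search']
SubmitBtn,button,//button[@type='submit']
"""

# Read the previously downloaded file (an in-memory string for the demo).
rows = list(csv.DictReader(io.StringIO(scraped_csv)))

# Simulate a re-scrape: one element's XPath changed on the page.
for row in rows:
    if row["Name"] == "SubmitBtn":
        row["XPath"] = "//button[@id='submit']"

# Write the updated CSV back out, ready to be re-uploaded.
out = io.StringIO()
writer = csv.DictWriter(out, fieldnames=["Name", "ControlType", "XPath"])
writer.writeheader()
writer.writerows(rows)
print(out.getvalue())
```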

Settings - Choose this option to view the following sub-options:
  • Customize Table - Select this option if you wish to select or unselect the listed items in a table.
  • Customize Control Type - Select this option if you wish to scrape a particular category of UI element. This option is applicable only when using the 'Scrape UI' option. For example, if a webpage contains various types of UI elements and you wish to scrape only input fields, use the Customize Control Type option.

Use the Back button to go back to the main window.

Use the Refresh button to refresh the customized settings.

Edit URL - Choose this option to change the application URL if needed.

This is the only method available to modify the URL.

Hover over a specific XPath to verify it. This highlights the corresponding UI element and displays the XPath match count.
Select this button to download the scraped UI elements table in CSV format.
This helps in monitoring metrics such as the response status code, response time, and load time, along with the given URL, date, and time. It helps in finding issues and fixes in a given application. By keeping track of these metrics, you can make sure your web services are performing well. For more information, refer to the PingLink article.
You can set up multiple XPath expressions for each UI element in the web scraper tool. If one XPath fails, the tool will automatically try another, reducing the need for manual updates to the scraped CSV file.
If you are unable to scrape the XPaths of the UI elements in a certain application, you can use the screenshot functionality to scrape the XPaths of the UI elements in the Scraper window.
Select this button to add a new row to the scraped elements table.
Select this button if you wish to delete a particular row in the scraped UI elements table.
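Two behaviors described above - verifying an XPath by its match count, and automatically falling back to an alternative XPath when the first one fails - can be sketched outside the tool with Python's standard library. ElementTree supports only a limited XPath subset, so the expressions here are deliberately simple, and the page markup is invented for the demo; this is not algoScraper's implementation, only the general idea.

```python
import xml.etree.ElementTree as ET

# Invented page fragment standing in for the application under test.
page = ET.fromstring("""
<body>
  <form>
    <input id="search" type="text"/>
    <button id="go" type="submit">Go</button>
  </form>
</body>
""")

def match_count(root, xpath):
    """Number of elements an XPath matches -- the 'XPath count' idea."""
    return len(root.findall(xpath))

def locate(root, xpaths):
    """Try each XPath in order and return the first one that matches,
    mirroring the tool's automatic fallback between alternatives."""
    for xp in xpaths:
        found = root.findall(xp)
        if found:
            return xp, found[0]
    return None, None

# A stale XPath followed by a working alternative.
xp, element = locate(page, [".//button[@id='submit']", ".//button[@id='go']"])
print(xp, match_count(page, xp))
```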


Note: algoScraper supports the extraction of XPaths for text containing special characters from websites. For example, suppose you are ordering a cheese pizza with Jalapeño toppings from a pizza website; the word 'Jalapeño' contains a special character. After downloading the scraped file, ensure that you manually update the XPaths if they have changed. Test cases are automatically generated to handle such scenarios, and you can access detailed reports after executing the scripts.
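The note above concerns XPaths whose target text contains non-ASCII characters. A minimal stand-alone sketch with Python's ElementTree and invented markup shows that text-matching predicates handle such characters like any other (ElementTree's `[.='text']` predicate requires Python 3.7+):

```python
import xml.etree.ElementTree as ET

# Invented menu fragment containing a special character in the text.
menu = ET.fromstring("""
<menu>
  <item>Cheese Pizza</item>
  <item>Jalapeño Toppings</item>
</menu>
""")

# XPath text predicates work on non-ASCII text just like ASCII text.
match = menu.findall(".//item[.='Jalapeño Toppings']")
print(len(match), match[0].text)
```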

3.2 Scraping the Application Using algoScraper Tool

Perform the following:

  1. Open the application. If you wish to scrape the entire page in one go, click the extension (described in the Prerequisite section) to view the algoScraper window, and then click the Scrape UI button. All the UI element details, with their corresponding XPath expressions, are listed. Upon successful completion, you will see the Scraping completed message.
  2. If you wish to scrape an individual UI element, perform the following: 
    1. Navigate to a particular UI element and right-click it to capture its XPath expression. The captured details are added to the scraped elements table.

    2. Common XPath refers to frequently used XPath expressions that locate elements.
      If you wish to capture a Common XPath expression, perform the following:
      Navigate to the particular category on a webpage and right-click the category to view the following screen, which displays the Common XPath.


      Usage of common XPath expression:
      Common XPaths can be used across different web pages or applications to locate similar types of elements, such as buttons, input fields, or links.

    3. After scraping the UI elements, download the file in CSV format. Upon successful download, a "File downloaded successfully" message is displayed. Click the icon to close the Scraper window.

  3. If you wish to scrape UI elements in a pop-up window, perform the following:
    1.  In the pop-up window, navigate to a particular UI element. 
    2.  Right-click on the UI element to capture its XPath expression. 
    3. The captured details are added to the scraped elements table, as shown:
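The Common XPath idea from step 2 above - one expression that locates the same kind of element across different pages - can be illustrated with a small Python ElementTree sketch. The two pages and the submit-button XPath are invented for the demo:

```python
import xml.etree.ElementTree as ET

# Two invented pages that share the same kind of element.
login_page = ET.fromstring(
    "<body><form><button type='submit'>Sign in</button></form></body>")
checkout_page = ET.fromstring(
    "<body><form><button type='submit'>Pay now</button></form></body>")

# A common XPath: targets any submit button, regardless of the page.
COMMON_XPATH = ".//button[@type='submit']"

for page in (login_page, checkout_page):
    button = page.find(COMMON_XPATH)
    print(button.text)
```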








