Reports
Automated tests need good reporting so that the test results are meaningful. For this purpose, we use the well-known Allure framework, and we have a short guide on how to set up Allure. To get even more insight into your test results, we enriched the reports with additional data. This article is about the additional features we have added and how to use them.
Sometimes it is necessary to know the exact link a step in the automation opened, and sometimes it is just convenient to jump into a flow at a certain point. For this we created a function that adds the currently displayed link as a step to the Allure report.
As soon as it is activated, every time a new page is opened, a link to exactly this page is stored in the report. This way you can verify that the link is correct and even open the page to take a look at it.
This is activated by default but can be deactivated via the `neodymium.report.enableStepLinks` property in the Neodymium configuration.
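For example, to switch the step links off, the property can be set in the Neodymium configuration file (a sketch assuming the usual `neodymium.properties` syntax):

```properties
# deactivate automatic step links in the Allure report
neodymium.report.enableStepLinks = false
```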
When looking into a failed test case, you need the corresponding test data to be able to replicate the issue. To spare you from digging through a bunch of code just to find the test data used, we display the test data in JSON format for each test in the Allure report.
This is activated by default but can be deactivated by setting the `neodymium.report.enableTestDataInReport` property in the Neodymium configuration to false.
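A sketch of the corresponding entry in `neodymium.properties`, assuming the usual properties syntax:

```properties
# do not attach the test data as JSON to the Allure report
neodymium.report.enableTestDataInReport = false
```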
While activated, this feature adds the test data to the Allure report as an attachment whose name always starts with "Testdata". Note that this only applies to test data that is initialized with `DataUtils` or `DataItem`, both belonging to Neodymium.
If the test data changes during the test run, you can also add the changes as an additional attachment to the Allure report. To accomplish that, we implemented the helper function `addDataAsJsonToReport(String name, Object data)` in `AllureAddons`, which belongs to Neodymium. The parameter `name` is the name of the attachment and the parameter `data` is the changed test data.
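A call might look like the following sketch; the `updatedUserData` object and the import path are assumptions for illustration:

```java
import com.xceptance.neodymium.util.AllureAddons; // assumed package

// after the test modified the data, attach the new state to the report;
// "Changed user data" becomes the attachment name in Allure
AllureAddons.addDataAsJsonToReport("Changed user data", updatedUserData);
```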
If you compare different JSON data and the comparison fails, you either get a huge, confusing error message or a short, meaningless one that does not show what exactly went wrong. To prevent you from wasting time on such error messages, we implemented the class `JsonAssert`, for which we utilized the already existing JSONAssert.
If `assertEquals` fails, an attachment named "Json Compare" containing the differences between the JSON data is added to the Allure report. If `assertNotEquals` fails, an attachment named "Json View" containing the JSON data once is added to the Allure report.
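A usage sketch; the JSON strings and the strict-mode flag (mirroring JSONAssert's signature) are illustrative assumptions:

```java
String expected = "{\"id\": 1, \"name\": \"Shirt\"}";
String actual = "{\"id\": 1, \"name\": \"Pants\"}";

// fails and attaches a "Json Compare" diff to the Allure report
JsonAssert.assertEquals(expected, actual, true);
```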
Accessibility reports are crucial when testing web pages because they help ensure that a site is usable for people of all abilities. That is why we introduced Google Lighthouse, an open-source tool for improving the quality of web pages, to Neodymium.
First of all, we recommend installing a package manager like npm, which we will also use to install the Lighthouse CLI. Once npm is installed, open a terminal and enter the command below.
npm install -g lighthouse
To make sure the Lighthouse installation was successful, you can run the following command.
lighthouse --version
With the objective of creating Lighthouse reports inside Neodymium, we implemented the class `LighthouseUtils`, containing the function `createLightHouseReport(WebDriver driver, String URL, String reportName)`. Calling this function with the required parameters generates a Lighthouse report of the current web page and automatically adds it to the Allure report under the name specified in the `reportName` parameter. Keep in mind that creating a Lighthouse report only works with Chrome or Chromium-based browsers.
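A call could look like this sketch; the URL and report name are placeholders, and `Neodymium.getDriver()` is assumed to return the current WebDriver instance:

```java
// audits the currently displayed page and attaches the result
// to the Allure report as "homepage-lighthouse"
LighthouseUtils.createLightHouseReport(Neodymium.getDriver(), "https://www.example.com", "homepage-lighthouse");
```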
A Lighthouse report consists of the following four categories.
- Performance
- Accessibility
- Best Practices
- Search Engine Optimisation (SEO)
Each of those categories gets a score between 0 and 100, which reveals how well the web page performed in that category. For these scores, Google defines the following ranges.
- 0 to 49: Poor
- 50 to 89: Needs Improvement
- 90 to 100: Good
To enable validating the scores of all four categories, we implemented the following score thresholds in the `neodymium.properties` file.
- `neodymium.lighthouse.performance`
- `neodymium.lighthouse.accessiblity`
- `neodymium.lighthouse.bestPractices`
- `neodymium.lighthouse.seo`
All of those configuration properties are set to 0.5 by default, which sets the score threshold of every category to 50. That means each category needs to match or exceed a score of 50 for the test to pass. All of the score thresholds can be adjusted to the user's needs.
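The validation itself boils down to a simple comparison. The following self-contained sketch is our own illustration (not Neodymium's actual code) of how a category score could be checked against a configured threshold:

```java
public class LighthouseThresholdCheck
{
    /**
     * Checks a Lighthouse category score (0-100) against a configured
     * threshold (0.0-1.0, as given in neodymium.properties).
     */
    public static boolean meetsThreshold(int score, double threshold)
    {
        // a property value of 0.5 corresponds to a required score of 50
        return score >= threshold * 100;
    }

    public static void main(String[] args)
    {
        // default threshold 0.5: a score of 50 passes, 49 fails
        System.out.println(meetsThreshold(50, 0.5)); // true
        System.out.println(meetsThreshold(49, 0.5)); // false
    }
}
```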