
Testing Cross Browser Compatibility and Generating Reports


1. Introduction

We have all come across situations where a website is not displayed properly on one browser, and we assume the website is not working. But as soon as we open it on a different browser, the website opens as expected. This behavior is what we mean by the compatibility of a website with different browsers.

Users have many different devices and browsers to choose from today. With such a wide variety of tablets, cell phones, and an increasing number of legacy browsers, software teams are challenged to make sure their products work well across different browsers and devices.

Each browser interprets the information on the web page differently. Thus, some browsers may lack support for features that your website relies on, making the website appear broken.

As QA automation testers, we test various websites on a daily basis. It’s essential to have a framework that can handle different browsers and versions, and using WebDriver allows us to switch between different browser drivers at runtime without creating multiple driver references.

So, it is very important for us to check whether the website is working as expected in different browsers; this is called cross-browser testing. And while testing websites, instead of running the tests in sequential order, if we run multiple tests simultaneously, it saves us a lot of time and effort. This feature is called Parallel Testing.

2. What is Cross Browser Testing?

Cross browser testing is a form of non-functional testing where the product (website or app) is tested across different browser and platform combinations.

Cross-browser testing means testing the application across multiple browsers, such as IE, Chrome, and Firefox, so that we can test our application effectively. It is the process of verifying your application’s compatibility with different browsers.

3. What should be Tested in Cross-Browser Testing?

In general, you have to check for two things in cross browser testing:

  1. The appearance of the page on the browser.

  2. The functionality of the page on the browser.

Determining which devices and browsers to test requires a bit of digging.

Ask yourself the following questions…

  • Where are people using your application?

  • Are they using it on their mobile device or are they only using it on desktops?

  • What requirements has the product manager defined around supported devices/browsers?

4. How to do Cross Browser Testing?

To perform cross browser testing at runtime, we can create a reference variable of the parent WebDriver interface and assign it to the required browser's driver class object, as below:

WebDriver driver = new ChromeDriver();

If we want to test for the Firefox browser then we can use the same WebDriver driver reference and assign it to FirefoxDriver class object as below.

driver = new FirefoxDriver();

The driver reference has access to all the methods of WebDriver. This works because every browser driver class extends the RemoteWebDriver class, and RemoteWebDriver implements the WebDriver interface.

Here, I have used Selenium version 4.8.1 for this code (Selenium Manager is available from version 4.6.0 onwards).

  • System.setProperty() is no longer needed for setting up the browser driver executable file

  • WebDriverManager is no longer required to set up driver binaries

To execute test cases with different browsers on the same machine at the same time, we can integrate the TestNG framework with Selenium WebDriver.

5. Example for Cross Browser Testing

This example has three files, created as a Maven project:

i) CrossBrowserScript.java

ii) pom.xml

iii) testng.xml

Note: We can use either an if-else block or a switch case to select different browsers.
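To see the selection logic in isolation, here is a small hypothetical helper (the BrowserSelector class and driverFor method are illustrative only, not part of the example project); it returns the name of the driver class that would be instantiated, so it runs without any real browsers:

```java
// Hypothetical sketch of the switch-based browser selection on its own;
// it returns the driver class name instead of creating a real Selenium driver.
public class BrowserSelector {

    public static String driverFor(String browser) {
        switch (browser.toUpperCase()) {
        case "FIREFOX":
            return "FirefoxDriver";
        case "CHROME":
            return "ChromeDriver";
        case "EDGE":
            return "EdgeDriver";
        default:
            throw new IllegalArgumentException("Browser is not correct: " + browser);
        }
    }

    public static void main(String[] args) {
        System.out.println(driverFor("chrome")); // ChromeDriver
        System.out.println(driverFor("Edge"));   // EdgeDriver
    }
}
```

Note that the comparison is case-insensitive because the input is uppercased before the switch, which matches how the parameter values ("Chrome", "Firefox", "Edge") arrive from testng.xml.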

package BrowserTypes;

import java.time.Duration;

import org.openqa.selenium.By;
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.WebElement;
import org.openqa.selenium.chrome.ChromeDriver;
import org.openqa.selenium.edge.EdgeDriver;
import org.openqa.selenium.firefox.FirefoxDriver;
import org.testng.annotations.AfterTest;
import org.testng.annotations.BeforeTest;
import org.testng.annotations.Parameters;
import org.testng.annotations.Test;

public class CrossBrowserScript {

 WebDriver driver;

 // Receives the browser name from the "browser" parameter in testng.xml
 @BeforeTest
 @Parameters("browser")
 public void setup(String browser) throws Exception {
  System.out.println("Browser Name is : " + browser);
  /* Using an if-else block:
  if (browser.equalsIgnoreCase("firefox")) {
   driver = new FirefoxDriver();
  } else if (browser.equalsIgnoreCase("chrome")) {
   driver = new ChromeDriver();
  } else if (browser.equalsIgnoreCase("edge")) {
   driver = new EdgeDriver();
  } else {
   throw new Exception("Browser is not correct");
  }
  */
  // Using a switch case:
  switch (browser.toUpperCase()) {
  case "FIREFOX":
   driver = new FirefoxDriver();
   break;
  case "CHROME":
   driver = new ChromeDriver();
   break;
  case "EDGE":
   driver = new EdgeDriver();
   break;
  default:
   throw new Exception("Browser is not correct");
  }
  driver.manage().timeouts().implicitlyWait(Duration.ofSeconds(10));
 }

 @Test
 public void testParameterWithXML() throws InterruptedException {
  // driver.get(...); // navigate to the application under test (URL not shown here)
  WebElement userName = driver.findElement(By.id("uid"));
  WebElement password = driver.findElement(By.id("password"));
  System.out.println("Test completed Successfully");
 }

 @AfterTest
 public void browserQuit() {
  driver.quit();
 }
}


In the pom.xml file, we add all the necessary plugins and dependencies.

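A minimal pom.xml along these lines would work for this setup. The Selenium version is the one mentioned in this blog; the groupId, artifactId, and TestNG version shown are illustrative assumptions, not taken from the original project:

```xml
<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0
                             http://maven.apache.org/xsd/maven-4.0.0.xsd">
  <modelVersion>4.0.0</modelVersion>
  <groupId>BrowserTypes</groupId>
  <artifactId>cross-browser-demo</artifactId>
  <version>1.0-SNAPSHOT</version>

  <dependencies>
    <!-- Selenium 4.8.1 (Selenium Manager handles driver binaries from 4.6.0 onwards) -->
    <dependency>
      <groupId>org.seleniumhq.selenium</groupId>
      <artifactId>selenium-java</artifactId>
      <version>4.8.1</version>
    </dependency>
    <!-- TestNG for the annotations, parameters, and reports used in this example -->
    <dependency>
      <groupId>org.testng</groupId>
      <artifactId>testng</artifactId>
      <version>7.7.1</version>
    </dependency>
  </dependencies>
</project>
```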

Since we are using TestNG @Parameters in the test class, we need to specify the parameter values in the testng.xml file; these are passed to the test case at runtime.

<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE suite SYSTEM "https://testng.org/testng-1.0.dtd">
<suite name="Suite" thread-count="3">
  <test name="ChromeTest">
    <parameter name="browser" value="Chrome" />
    <classes>
      <class name="BrowserTypes.CrossBrowserScript" />
    </classes>
  </test>
  <test name="FirefoxTest">
    <parameter name="browser" value="Firefox" />
    <classes>
      <class name="BrowserTypes.CrossBrowserScript" />
    </classes>
  </test>
  <test name="EdgeTest">
    <parameter name="browser" value="Edge" />
    <classes>
      <class name="BrowserTypes.CrossBrowserScript" />
    </classes>
  </test>
</suite> <!-- Suite -->

Here, because the testng.xml file has three test tags (‘ChromeTest’, ‘FirefoxTest’, ‘EdgeTest’), the test case will execute three times, once for each browser.
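One caveat worth noting: the thread-count attribute alone does not make the three test blocks run concurrently; TestNG also needs the parallel attribute on the suite tag. A one-line sketch of that change (covered in detail in the parallel-testing blog mentioned at the end):

```xml
<suite name="Suite" parallel="tests" thread-count="3">
```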

Note: To run the test, right-click on testng.xml, select Run As, and click TestNG Suite.

Console Output:

Results of running suite:

6. How To Generate Emailable Report In TestNG?

  • Emailable reports are generated in TestNG to let the user send their test reports to other team members. They do not require any extra work from the tester and are part of the overall test execution.

  • To generate emailable reports, first run the TestNG test class if you have not already done so.

  • Once we have run the test case, a new folder is generated in the same directory with the name test-output. If it’s not appearing, please refresh the project folder.

If you select the emailable-report.html, you can see the below screen:

7. How To Generate Index File In TestNG?

Emailable reports are a type of summary reports that one can transfer to other people in the team through any medium.

Index reports, on the other hand, contain an index-like structure of the different parts of the report, such as failed tests, test files, passed tests, etc.

To open the index.html file, locate it in the same test-output folder.

The below index report contains two parts. The left part contains the index, which is why it is called an index report, while the right part shows the content of the selected index entry.

8. Cross Browser Testing Tools

Cross browser testing comes with the challenge of needing to test on physical devices/browsers – many of which you may not have direct access to. Lucky for us, there are a number of cross browser testing tools that allow you to view your application on different browsers and devices.

Below are some of the most commonly used Cross-Browser Testing Tools:

  • Browsershots – A free service that provides screenshots of how a page looks on a variety of browsers.

  • Browserstack – A leader in the industry, Browserstack offers the ability to test on real machines by inputting the URL of the page/application you’re testing.

  • Browserling – A simple tool with over 80 browsers to test on.

  • Browsera – Tests differences in layout across browsers and reports scripting errors.

  • CrossBrowserTesting – Another leader in the space, CrossBrowserTesting has a large number of browsers to choose from, and you can even run your selenium scripts to automate actions.

  • Litmus – Allows for basic screenshots in different browsers, but mainly focuses on providing cross-client testing for emails.

9. Conclusion

Cross browser testing helps us provide the best possible experience to all of our users. While it’s often one of the biggest pain points for development and QA teams, it is worth spending the time to make sure you cover all the major browsers you support. I hope you got an overall understanding of Cross Browser Testing from this blog. We will discuss Parallel Testing in my other blog, "how-to-do-parallel-testing-using-testng-and-reduce-run-time-drastically-in-selenium".

Happy Testing!
