I have a @Test set up in a script that runs with some soft asserts. However, I am running into a problem with the placement of the assertAll(). I want all of the URLs to go through before the assertAll(). Is this possible, or is there another recommended approach?
@Test
public static void checkUrl(String requestUrl, String expectedUrl) {
    SoftAssert softAssert = new SoftAssert();
    try {
        URL obj = new URL(requestUrl);
        HttpURLConnection conn = (HttpURLConnection) obj.openConnection();
        conn.setReadTimeout(5000);
        conn.addRequestProperty("Accept-Language", "en-US,en;q=0.8");
        conn.addRequestProperty("User-Agent", "Mozilla");
        conn.addRequestProperty("Referer", "google.com");
        System.out.println();
        System.out.println("Request URL ... " + requestUrl);
        boolean redirect = false;
        // normally, 3xx is redirect
        int status = conn.getResponseCode();
        if (status != HttpURLConnection.HTTP_OK) {
            if (status == HttpURLConnection.HTTP_MOVED_TEMP
                    || status == HttpURLConnection.HTTP_MOVED_PERM
                    || status == HttpURLConnection.HTTP_SEE_OTHER) {
                redirect = true;
            }
        }
        System.out.println("Response Code ... " + status);
        if (redirect) {
            // get redirect url from "Location" header field
            String redirectUrl = conn.getHeaderField("Location");
            // get the cookie if needed, for login
            String cookies = conn.getHeaderField("Set-Cookie");
            // open the new connection again
            conn = (HttpURLConnection) new URL(redirectUrl).openConnection();
            conn.setRequestProperty("Cookie", cookies);
            conn.addRequestProperty("Accept-Language", "en-US,en;q=0.8");
            conn.addRequestProperty("User-Agent", "Mozilla");
            conn.addRequestProperty("Referer", "google.com");
            System.out.println("Redirect to URL : " + redirectUrl);
            //Assert.assertEquals(redirectUrl, expectedUrl);
            softAssert.assertEquals(redirectUrl, expectedUrl,
                    "Expected URL does not match " + requestUrl);
        } else {
            //org.testng.Assert.assertTrue(redirect);
            softAssert.assertTrue(redirect, "Please check the status for " + requestUrl);
            System.out.println("** Please check status for " + requestUrl);
            System.out.println("************************************************");
            System.out.println("************************************************");
        }
    } catch (Exception e) {
        e.printStackTrace();
    }
}
Soft asserts are the opposite of hard asserts. With soft asserts, subsequent assertions keep running even when one assert validation fails, i.e., the test execution does not stop. Soft asserts are not enabled by default in TestNG; you need to import org.testng.asserts.SoftAssert.
A SoftAssert collects errors while the @Test is running. It does not throw an exception when an assert fails, so execution continues with the next step after the assert statement.
With a hard "Assert", as soon as a validation fails, execution of that particular test method stops and the method is marked as failed. With "Verify" (soft assertion) semantics, the test method keeps executing even after an assertion statement fails.
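As a minimal sketch of that behavior (class and method names here are illustrative), a SoftAssert records every failed assertXXX() call and only reports them when assertAll() is invoked:

import org.testng.annotations.Test;
import org.testng.asserts.SoftAssert;

public class SoftAssertBasics {

    @Test
    public void collectsAllFailures() {
        SoftAssert softAssert = new SoftAssert();

        // Neither of these lines stops the test method; the failures are only recorded.
        softAssert.assertEquals("actual", "expected", "strings differ");
        softAssert.assertTrue(false, "condition was false");

        // assertAll() replays every recorded failure and marks the test method as failed.
        softAssert.assertAll();
    }
}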
The use case that you are looking for kind of defeats the purpose of SoftAssert. SoftAssert was basically introduced in TestNG so that you can gather all the assertions throughout one @Test method but fail the test method only at the end (when you invoke assertAll()).
A data driven @Test method is basically a @Test method that runs "n" times (each iteration running with a different set of data). So it doesn't make sense to try and leverage SoftAssert and invoke its assertAll() on the last iteration, because if you do that, it would basically boil down to only the last iteration failing.
So if you are looking at re-running tests by using the testng-failed.xml, it would contain only the index of the last iteration (which is kind of absurd, because it wasn't the last iteration that actually failed).
So ideally speaking, you should make use of SoftAssert only within the scope of a single iteration. That means you instantiate a SoftAssert object within the @Test method, invoke a bunch of assertXXX() calls, and at the end of the method you invoke assertAll().
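Applied to the checkUrl() example above, that per-iteration pattern would look roughly like the sketch below; the data provider values are placeholders, and the HTTP handling is trimmed down to the assertion-relevant part:

import java.net.HttpURLConnection;
import java.net.URL;

import org.testng.annotations.DataProvider;
import org.testng.annotations.Test;
import org.testng.asserts.SoftAssert;

public class UrlRedirectTest {

    @DataProvider(name = "urls")
    public Object[][] urls() {
        // Placeholder data; substitute your real request/expected URL pairs.
        return new Object[][] {
                {"http://example.com/old", "https://example.com/new"},
                {"http://example.org/old", "https://example.org/new"}
        };
    }

    @Test(dataProvider = "urls")
    public void checkUrl(String requestUrl, String expectedUrl) throws Exception {
        // One SoftAssert per iteration, so each data set fails (or passes) on its own.
        SoftAssert softAssert = new SoftAssert();

        HttpURLConnection conn = (HttpURLConnection) new URL(requestUrl).openConnection();
        conn.setReadTimeout(5000);
        int status = conn.getResponseCode();

        if (status == HttpURLConnection.HTTP_MOVED_TEMP
                || status == HttpURLConnection.HTTP_MOVED_PERM
                || status == HttpURLConnection.HTTP_SEE_OTHER) {
            String redirectUrl = conn.getHeaderField("Location");
            softAssert.assertEquals(redirectUrl, expectedUrl,
                    "Expected URL does not match for " + requestUrl);
        } else {
            softAssert.fail("Please check the status for " + requestUrl);
        }

        // assertAll() at the end of the iteration: testng-failed.xml then records
        // the index of the iteration that actually failed.
        softAssert.assertAll();
    }
}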
All said and done, if you are still looking for a sample that shows how to invoke assertAll() only on the last iteration, here's one.
First we define an interface that lets the data provider set the size of its data set as an attribute on the test class.
public interface IDataSet {
    void setSize(int size);
}
The test class looks like this:
import org.testng.annotations.DataProvider;
import org.testng.annotations.Test;
import org.testng.annotations.TestInstance;
import org.testng.asserts.SoftAssert;

import java.util.concurrent.atomic.AtomicInteger;

public class SoftAssertDemo implements IDataSet {

    private int size;
    private SoftAssert assertion = new SoftAssert();
    private AtomicInteger counter = new AtomicInteger(1);

    @Override
    public void setSize(int size) {
        this.size = size;
    }

    @Test(dataProvider = "dp")
    public void testMethod(int number) {
        if ((number % 2) == 0) {
            assertion.fail("Simulating a failure for " + number);
        }
        if (counter.getAndIncrement() == size) {
            assertion.assertAll();
        }
    }

    @DataProvider(name = "dp")
    public Object[][] getData(@TestInstance Object object) {
        Object[][] data = new Object[][] {{1}, {2}, {3}, {4}, {5}};
        if (object instanceof IDataSet) {
            ((IDataSet) object).setSize(data.length);
        }
        return data;
    }
}
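If you want to see this in action, one way (using a hypothetical Runner class, sketched here) is to drive TestNG programmatically; listing SoftAssertDemo in a regular testng.xml works just as well:

import org.testng.TestNG;

public class Runner {
    public static void main(String[] args) {
        // Equivalent to listing SoftAssertDemo as a test class in a testng.xml.
        TestNG testng = new TestNG();
        testng.setTestClasses(new Class[] {SoftAssertDemo.class});
        testng.run();
    }
}

With the data set above, the failures simulated for 2 and 4 only surface when assertAll() runs on the fifth iteration, so TestNG reports the last iteration as the failed one; that is exactly the re-run problem described earlier.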
Caveats in this approach:
- The test class can have only one @DataProvider method in it, because the data provider passes the size of its data set back to the test class instance. If you have two or more data providers, there is a chance of a data race wherein one data provider overwrites the other.
- This works only when the @Test methods that are powered by the data provider don't run in parallel.