I have a C# function that checks whether the Internet connection is up by retrieving a tiny (64-byte) XML document from the router's status page:
public bool isOn()
{
    HttpWebRequest hwebRequest = (HttpWebRequest)WebRequest.Create("http://" + this.routerIp + "/top_conn.xml");
    hwebRequest.Timeout = 500;
    HttpWebResponse hWebResponse = (HttpWebResponse)hwebRequest.GetResponse();
    XmlTextReader oXmlReader = new XmlTextReader(hWebResponse.GetResponseStream());
    string value;
    while (oXmlReader.Read())
    {
        value = oXmlReader.Value;
        if (value.Trim() != "")
        {
            return !value.Substring(value.IndexOf("=") + 1, 1).Equals("0");
        }
    }
    return false;
}
Using Mozilla Firefox 3.5 with the Firebug add-on, I estimated that retrieving the page normally takes about 30 ms, yet the request still hits the very generous 500 ms timeout quite often. How can I dramatically improve the performance?
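For reference, the latency can also be timed from inside .NET rather than through Firebug; a minimal sketch (purely illustrative, calling the isOn() method above) could look like this:

// Illustrative timing harness for the method above.
var timer = System.Diagnostics.Stopwatch.StartNew();
bool connected = isOn();
timer.Stop();
Console.WriteLine("isOn() took {0} ms, result: {1}", timer.ElapsedMilliseconds, connected);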
Thanks in advance
You're not closing the web response. If you've issued earlier requests to the same server and never closed those responses, that's the problem: .NET limits the number of concurrent connections per host (two by default), so an unclosed response keeps its connection tied up and each new request has to wait for an old one to time out. Stick the response (and the reader) in using statements:
public bool IsOn()
{
    HttpWebRequest request = (HttpWebRequest) WebRequest.Create(
        "http://" + this.routerIp + "/top_conn.xml");
    request.Timeout = 500;

    // Disposing the response and reader releases the underlying
    // connection back to the pool as soon as we're done with it.
    using (HttpWebResponse response = (HttpWebResponse) request.GetResponse())
    using (XmlReader reader = XmlReader.Create(response.GetResponseStream()))
    {
        while (reader.Read())
        {
            string value = reader.Value;
            if (value.Trim() != "")
            {
                // Treat the character after "=" in the first non-blank
                // text node as the flag: non-zero means the link is up.
                return value.Substring(value.IndexOf("=") + 1, 1) != "0";
            }
        }
    }
    return false;
}
(I've made a few other alterations at the same time...)
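If the router is queried from several places in your code, you can also raise the per-host connection limit explicitly. This is a minimal sketch, assuming the same routerIp field from the question; the limit of 4 is an arbitrary illustrative value:

// Raise the connection limit for the router's ServicePoint so that a
// single stray unclosed response cannot starve every later request.
ServicePoint routerPoint = ServicePointManager.FindServicePoint(
    new Uri("http://" + this.routerIp + "/top_conn.xml"));
routerPoint.ConnectionLimit = 4;

That only masks the symptom, though; disposing the response, as shown above, is the real fix.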