At the top of Form1 I declared:
WebClient Client;
Then in the constructor:
Client = new WebClient();
Client.DownloadFileCompleted += Client_DownloadFileCompleted;
Client.DownloadProgressChanged += Client_DownloadProgressChanged;
Then I have this method, which I call every minute:
private void fileDownloadRadar()
{
    if (Client.IsBusy)
    {
        Client.CancelAsync();
    }
    else
    {
        Client.DownloadProgressChanged += Client_DownloadProgressChanged;
        Client.DownloadFileAsync(myUri, combinedTemp);
    }
}
Every minute it downloads the same image from a website. It worked for more than 24 hours with no problems, but now it throws this exception in the download-completed event:
private void Client_DownloadFileCompleted(object sender, AsyncCompletedEventArgs e)
{
    if (e.Error != null)
    {
        timer1.Stop();
        span = new TimeSpan(0, (int)numericUpDown1.Value, 0);
        label21.Text = span.ToString(@"mm\:ss");
        timer3.Start();
    }
    else if (!e.Cancelled)
    {
        label19.ForeColor = Color.Green;
        label19.Text = "חיבור האינטרנט והאתר תקינים"; // "The internet connection and the site are OK"
        label19.Visible = true;
        timer3.Stop();
        if (!timer1.Enabled)
        {
            if (BeginDownload)
            {
                timer1.Start();
            }
        }
        bool fileok = Bad_File_Testing(combinedTemp);
        if (fileok)
        {
            File1 = new Bitmap(combinedTemp);
            bool compared = ComparingImages(File1);
            if (!compared)
            {
                DirectoryInfo dir1 = new DirectoryInfo(sf);
                FileInfo[] fi = dir1.GetFiles("*.gif");
                last_file = fi[fi.Length - 1].FullName;
                string lastFileNumber = last_file.Substring(82, 6);
                int lastNumber = int.Parse(lastFileNumber);
                lastNumber++;
                string newFileName = string.Format("radar{0:D6}.gif", lastNumber);
                identicalFilesComparison = File_Utility.File_Comparison(combinedTemp, last_file);
                if (!identicalFilesComparison)
                {
                    string newfile = Path.Combine(sf, newFileName);
                    File.Copy(combinedTemp, newfile);
                    LastFileIsEmpty();
                }
            }
            if (checkBox2.Checked)
            {
                simdownloads.SimulateDownloadRadar();
            }
        }
        else
        {
            File.Delete(combinedTemp);
        }
        File1.Dispose();
    }
}
Now it stops inside the if (e.Error != null) block, on the line timer1.Stop();. This is the stack trace I see on e.Error:
at System.Net.HttpWebRequest.EndGetResponse(IAsyncResult asyncResult)
at System.Net.WebClient.GetWebResponse(WebRequest request, IAsyncResult result)
at System.Net.WebClient.DownloadBitsResponseCallback(IAsyncResult result)
How can I solve this problem so it won't happen again, and why did it happen?
EDIT:
I tried changing the fileDownloadRadar method to this, to dispose of the client every time:
private void fileDownloadRadar()
{
    using (WebClient client = new WebClient())
    {
        if (client.IsBusy)
        {
            client.CancelAsync();
        }
        else
        {
            client.DownloadFileAsync(myUri, combinedTemp);
        }
    }
}
The problem is that the constructor uses the field Client, while here it's a local client; they are two different WebClient variables. How can I solve this and the exception?
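One way to resolve the field-versus-local conflict is to keep the single field-level client but dispose of it and recreate it (re-attaching the handlers) on every tick, instead of introducing a second local variable. This is only a sketch, assuming the same field, handler, and variable names (Client, myUri, combinedTemp) as in the question:

```csharp
private void fileDownloadRadar()
{
    // Tear down the previous client so a stuck request can't block the next one.
    if (Client != null)
    {
        if (Client.IsBusy)
        {
            Client.CancelAsync();
        }
        Client.DownloadFileCompleted -= Client_DownloadFileCompleted;
        Client.DownloadProgressChanged -= Client_DownloadProgressChanged;
        Client.Dispose();
    }

    // Recreate the field-level client and attach the handlers exactly once,
    // avoiding the duplicate subscription that happens when += runs every call.
    Client = new WebClient();
    Client.DownloadFileCompleted += Client_DownloadFileCompleted;
    Client.DownloadProgressChanged += Client_DownloadProgressChanged;
    Client.DownloadFileAsync(myUri, combinedTemp);
}
```

This way the field is always the current client, and handlers never pile up from repeated subscription.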
This is the website link for the site with the image I'm downloading every minute. I'm still not sure why I got this exception after it worked with no problems for more than 24 hours. I ran the program again and it's working now, but I wonder whether I will get this exception again tomorrow or in the next few hours.
The site with image i'm downloading
I had the same problem with WebClient and found the solution here: http://blog.developers.ba/fixing-issue-httpclient-many-automatic-redirections-attempted/
Using HttpWebRequest and setting a CookieContainer solved the problem, for example:
string res;
HttpWebRequest webReq = (HttpWebRequest)WebRequest.Create(linkUrl);
try
{
    webReq.CookieContainer = new CookieContainer();
    webReq.Method = "GET";
    using (WebResponse response = webReq.GetResponse())
    using (Stream stream = response.GetResponseStream())
    using (StreamReader reader = new StreamReader(stream))
    {
        res = reader.ReadToEnd();
        ...
    }
}
catch (Exception ex)
{
    ...
}
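Since the question downloads a .gif to disk rather than reading text, the same CookieContainer approach can be adapted to copy the response stream straight to a file. A minimal sketch, where linkUrl and combinedTemp are the URL and target path from the question:

```csharp
HttpWebRequest webReq = (HttpWebRequest)WebRequest.Create(linkUrl);
webReq.CookieContainer = new CookieContainer();
webReq.Method = "GET";

using (WebResponse response = webReq.GetResponse())
using (Stream stream = response.GetResponseStream())
using (FileStream file = File.Create(combinedTemp))
{
    // Copy the image bytes straight to disk.
    stream.CopyTo(file);
}
```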
If you're getting an exception saying there are too many automatic redirections, it's because the Web site you're trying to access is redirecting to another site, which redirects to another, and another, and so on, beyond the default redirection limit.
So, for example, you try to get an image from site A. Site A redirects you to site B. Site B redirects you to site C, etc.
WebClient is configured to follow redirections up to some default limit. Since WebClient is based on HttpWebRequest, it's likely using the default value for MaximumAutomaticRedirections, which is 50.
Most likely, either there is a bug on the server and it's redirecting in a tight loop, or they got tired of you hitting the server for the same file once per minute and they're purposely redirecting you in a circle.
The only way to determine what's really happening is to change your program so that it doesn't automatically follow redirections. That way, you can examine the redirection URL returned by the Web site and determine what's really going on. If you want to do that, you'll need to use HttpWebRequest rather than WebClient.
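A minimal sketch of that diagnostic, assuming linkUrl is the image URL: with automatic redirection turned off, a 3xx response is returned as-is, and the Location header shows where the server is trying to send you.

```csharp
HttpWebRequest req = (HttpWebRequest)WebRequest.Create(linkUrl);
req.AllowAutoRedirect = false; // don't follow redirects; inspect them instead

using (HttpWebResponse resp = (HttpWebResponse)req.GetResponse())
{
    Console.WriteLine((int)resp.StatusCode);     // e.g. 301/302 if redirecting
    Console.WriteLine(resp.Headers["Location"]); // the redirect target, if any
}
```

If the Location header keeps pointing back at the same URL (or cycles through a few), you've found the redirect loop.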
Or, you could use something like wget with verbose logging turned on. That will show you what the server is returning when you make a request.
Although this is an old topic, I couldn't help but notice that the poster was using WebClient, which sends no User-Agent header by default. Many sites will reject or redirect clients that don't have a proper User-Agent string.
Consider setting WebClient.Headers["User-Agent"] before making the request.
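For example, a short sketch (the User-Agent value here is just an illustrative browser string, and myUri/combinedTemp are the names from the question):

```csharp
using (var client = new WebClient())
{
    // Some servers reject or redirect requests without a browser-like User-Agent.
    client.Headers["User-Agent"] =
        "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36";
    client.DownloadFile(myUri, combinedTemp);
}
```

Note that WebClient can clear request headers between calls, so set the header before each request rather than once at startup.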