I'm trying to create a web service which goes to a URL, e.g. www.domain.co.uk/prices.csv, and then reads the CSV file. Is this possible, and how? Ideally without downloading the CSV file.
Fortunately, there's an easy trick with R's read.csv() function, which can import data from the web straight into a data frame: simply pass the URL as the file argument, e.g. read.csv("http://www.domain.co.uk/prices.csv").
There is no simple, built-in way to export a website to a CSV file. The usual way to achieve this is a web-scraping setup with some automation: a crawler is programmed to visit the source websites, fetch the required data from the pages, and save it to an output file.
You could use:
using System.IO;
using System.Net;

public string GetCSV(string url)
{
    // Issue the HTTP request and read the entire response body into memory.
    HttpWebRequest req = (HttpWebRequest)WebRequest.Create(url);
    using (HttpWebResponse resp = (HttpWebResponse)req.GetResponse())
    using (StreamReader sr = new StreamReader(resp.GetResponseStream()))
    {
        return sr.ReadToEnd();
    }
}
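With the URL from the question, this reads the file straight into memory over HTTP, so nothing is saved to disk:

string csv = GetCSV("http://www.domain.co.uk/prices.csv");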
And then to split it:
using System.Collections.Generic;

// Naive splitter: breaks the CSV text into individual fields. It does not
// handle quoted fields that contain commas or line breaks.
public static List<string> SplitCSV(string url)
{
    List<string> fields = new List<string>();
    string fileList = GetCSV(url); // was getCSV(...); C# is case-sensitive
    // Split on line breaks as well as commas so that rows are not merged.
    string[] tempStr = fileList.Split('\r', '\n', ',');
    foreach (string item in tempStr)
    {
        if (!string.IsNullOrWhiteSpace(item))
        {
            fields.Add(item);
        }
    }
    return fields;
}
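Putting the two together with the question's URL (the Console output is just for illustration):

List<string> fields = SplitCSV("http://www.domain.co.uk/prices.csv");
foreach (string field in fields)
{
    Console.WriteLine(field); // print each parsed field
}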
That said, there are plenty of CSV parsers out there, and I would advise against rolling your own. FileHelpers is a good one.
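As a rough sketch of the FileHelpers route, reusing GetCSV() from above; note the PriceRecord layout is an assumed example, since the question doesn't say what columns prices.csv contains:

using FileHelpers;

// Assumed record layout - adjust the fields to match the columns in prices.csv.
// Add [IgnoreFirst(1)] above the class if the file has a header row.
[DelimitedRecord(",")]
public class PriceRecord
{
    public string ProductName;
    public decimal Price;
}

// Parse the fetched CSV text into strongly typed records.
var engine = new FileHelperEngine<PriceRecord>();
PriceRecord[] records = engine.ReadString(GetCSV("http://www.domain.co.uk/prices.csv"));

Unlike the manual Split() approach above, this handles quoted fields and row boundaries for you.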