
Download files by chunks in multiple threads in Go

I need to download files chunk by chunk in multiple threads. For example, I have 1k files, each ~100 MB to 1 GB, and I can only download these files in 4096-byte chunks (each HTTP GET request gives me only 4 KB).

Downloading them in one thread might take too long, so I want to download them in, say, 20 threads (one thread per file), and I also need to download several chunks in each of those threads simultaneously.

Is there any example that shows such logic?

Bobby asked Aug 02 '17 23:08

1 Answer

This is an example of how to set up a concurrent downloader. Things to be aware of are bandwidth, memory, and disk space. You can kill your bandwidth by trying to do too much at once, and the same goes for memory. You're downloading pretty big files, so memory can be an issue.

Another thing to note is that by using goroutines you lose request order. So if the order of the returned bytes matters, this will not work, because you would have to know the byte order to assemble the file in the end. That would mean downloading one at a time is best, unless you implement a way to keep track of the order (maybe some kind of global map[order int][]byte with a mutex to prevent race conditions); see the sketch after the code below.

An alternative that doesn't involve Go (assuming you have a unix machine, for ease) is to use curl, which supports ranged downloads via its -r/--range flag: see http://osxdaily.com/2014/02/13/download-with-curl/

package main

import (
    "bytes"
    "fmt"
    "io"
    "log"
    "net/http"
    "os"
    "sync"
    "time"
)

// now you're going to have to be careful, because you can potentially run
// out of memory downloading too many files at once...
// however, here is an example that can be modded
func downloader(wg *sync.WaitGroup, sema chan struct{}, fileNum int, URL string) {
    sema <- struct{}{}
    defer func() {
        <-sema
        wg.Done()
    }()

    // Timeout is a time.Duration; a bare 10 would mean 10 nanoseconds.
    // Big files need a generous timeout.
    client := &http.Client{Timeout: 10 * time.Minute}
    res, err := client.Get(URL)
    if err != nil {
        // don't log.Fatal inside a goroutine; report and bail out
        log.Println(err)
        return
    }
    defer res.Body.Close()
    var buf bytes.Buffer
    // I'm copying to a buffer before writing it to file.
    // I could also just use io.Copy to write it to the file
    // directly and save memory by dumping to the disk directly;
    // see the variant below.
    if _, err := io.Copy(&buf, res.Body); err != nil {
        log.Println(err)
        return
    }
    // write the bytes to file
    if err := os.WriteFile(fmt.Sprintf("file%d.txt", fileNum), buf.Bytes(), 0644); err != nil {
        log.Println(err)
    }
}
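
// A minimal sketch of the memory-saving alternative mentioned in the
// comments above: stream the response body straight to disk with io.Copy
// so the whole payload never sits in RAM. It reuses the same semaphore
// pattern; the function name is just illustrative.
func downloaderToDisk(wg *sync.WaitGroup, sema chan struct{}, fileNum int, URL string) {
    sema <- struct{}{}
    defer func() {
        <-sema
        wg.Done()
    }()

    client := &http.Client{Timeout: 10 * time.Minute}
    res, err := client.Get(URL)
    if err != nil {
        log.Println(err)
        return
    }
    defer res.Body.Close()

    f, err := os.Create(fmt.Sprintf("file%d.txt", fileNum))
    if err != nil {
        log.Println(err)
        return
    }
    defer f.Close()

    // io.Copy streams through a small internal buffer instead of
    // holding the entire file in memory
    if _, err := io.Copy(f, res.Body); err != nil {
        log.Println(err)
    }
}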

func main() {
    links := []string{
        "url1",
        "url2", // etc...
    }
    var wg sync.WaitGroup
    // limit to four downloads at a time; the buffered channel acts as a semaphore
    limiter := make(chan struct{}, 4)
    for i, link := range links {
        wg.Add(1)
        go downloader(&wg, limiter, i, link)
    }
    wg.Wait()

}
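
If byte order matters, here is a minimal sketch of the mutex-guarded map idea mentioned above. It assumes the server honors HTTP Range requests; the URL, chunk size, and chunk count are placeholders (in practice you would derive the count from the Content-Length header).

package main

import (
    "fmt"
    "io"
    "log"
    "net/http"
    "os"
    "sync"
)

const chunkSize = 4096 // 4 KB per request, per the question

// fetchChunk downloads one ranged chunk and stores it under its index.
func fetchChunk(url string, index int, mu *sync.Mutex, chunks map[int][]byte, wg *sync.WaitGroup) {
    defer wg.Done()
    req, err := http.NewRequest("GET", url, nil)
    if err != nil {
        log.Println(err)
        return
    }
    start := index * chunkSize
    req.Header.Set("Range", fmt.Sprintf("bytes=%d-%d", start, start+chunkSize-1))
    res, err := http.DefaultClient.Do(req)
    if err != nil {
        log.Println(err)
        return
    }
    defer res.Body.Close()
    data, err := io.ReadAll(res.Body)
    if err != nil {
        log.Println(err)
        return
    }
    mu.Lock()
    chunks[index] = data // the mutex prevents concurrent map writes
    mu.Unlock()
}

func main() {
    const url = "http://example.com/bigfile" // placeholder
    const numChunks = 8                      // placeholder; derive from Content-Length in practice
    var (
        mu     sync.Mutex
        wg     sync.WaitGroup
        chunks = make(map[int][]byte, numChunks)
    )
    for i := 0; i < numChunks; i++ {
        wg.Add(1)
        go fetchChunk(url, i, &mu, chunks, &wg)
    }
    wg.Wait()
    // reassemble the file in index order
    out, err := os.Create("assembled.bin")
    if err != nil {
        log.Fatal(err)
    }
    defer out.Close()
    for i := 0; i < numChunks; i++ {
        out.Write(chunks[i])
    }
}

A semaphore like the limiter above could bound the number of in-flight chunks, and writing each chunk with os.File.WriteAt would avoid holding the whole map in memory.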
reticentroot answered Nov 09 '22 16:11