 

Can PowerShell be used to list the contents of a URL directory?

We have a bunch of zip files hosted on an FTP server, which are also accessible via HTTP. I would love to do something like (gci http://test.com/test/ *.zip) and get back all the zip files that exist on the web server.

Does anyone know of a way to do such a thing in a clean way?

TIA

asked Sep 16 '25 by mumbles

1 Answer

This is quite easy with Invoke-WebRequest (PowerShell 3.0+):

# Grab the directory index and keep only the links that point to .zip files
$r = Invoke-WebRequest http://asite.com/test2/ -UseBasicParsing
$r.Links | Where-Object { $_.href -match '\.zip$' }

Of course, as arco444 notes, directory listing must be enabled on the web server for this to work.
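
Once the matching hrefs are in hand, they can be handed straight back to Invoke-WebRequest to pull the archives down. A minimal sketch, assuming the (hypothetical) listing at http://asite.com/test2/ uses relative links and that C:\temp exists as a download folder:

# Download every .zip found in the index (hypothetical URL and target folder)
$base = 'http://asite.com/test2/'
$dest = 'C:\temp'
$r = Invoke-WebRequest $base -UseBasicParsing
$r.Links | Where-Object { $_.href -match '\.zip$' } | ForEach-Object {
    $name = Split-Path $_.href -Leaf                      # keep only the file name
    Invoke-WebRequest ($base + $name) -OutFile (Join-Path $dest $name)
}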


Edit: to get the last modified file, you will have to parse the HTML. Here is an example (the regexes will have to be adapted to your configuration):

# Walk the rows of the directory index and collect link / date / size for each zip
$col = @()
$link = $date = $size = ""
$r = Invoke-WebRequest http://asite.com/test2/     # no -UseBasicParsing: ParsedHtml is needed below
$r.ParsedHtml.body.getElementsByTagName('TR') | ForEach-Object {
    $_.getElementsByTagName('TD') | Select-Object -ExpandProperty innerHTML | ForEach-Object {
        switch -regex ($_) {
            "(.)*zip"             { $link = $_; break }   # the file-name cell
            "\d{2}-...-\d{4}(.)*" { $date = $_; break }   # the "last modified" cell, e.g. 16-Sep-2025 09:09
            "^\d*[KM]"            { $size = $_; break }   # the size cell, e.g. 120K
            default {}
        }
    }
    # When all three cells of a row have been seen, record the row and reset for the next one
    if ($link -and $date -and $size) {
        $o = New-Object -TypeName psobject | Select-Object -Property link, date, size
        $o.link = $link
        $o.date = $date
        $o.size = $size
        $col += $o
        $link = $date = $size = ""
    }
}
# Deduplicate, sort chronologically and keep the most recently modified entry
# (the [datetime] cast may need adjusting to your server's date format)
$col | Select-Object -Unique link, date, size |
    Sort-Object { [datetime]$_.date } |
    Select-Object -Last 1
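
From there, grabbing whatever that pipeline identifies as the newest archive is one more call. A sketch under the same assumptions (hypothetical URL and download folder); the href extraction is a guess at the anchor markup inside the link cell and may need tweaking:

# Re-run the selection, pull the href out of the link cell and download that file
$latest = $col | Select-Object -Unique link, date, size |
    Sort-Object { [datetime]$_.date } | Select-Object -Last 1
if ($latest.link -match 'href="?([^">\s]+)') {
    $name = $Matches[1]
    Invoke-WebRequest ('http://asite.com/test2/' + $name) -OutFile (Join-Path 'C:\temp' $name)
}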
answered Sep 19 '25 by Loïc MICHEL