How to download all the files in a directory from Goindex Websites?

There are a lot of websites that don't publish direct Google Drive links but instead provide Goindex links like this one-[300%20GB]/
Does anybody know how to either get a direct Google Drive link or download all the files present in one directory?

[For those who don't know about Goindex: it lets you publish a (sort of) directory website like the one above directly from Google Drive, without exposing direct Google Drive links to users.
For more information about Goindex, check -]


Try wget or wget2, it should work.
Or the simplest thing would be to just ask the owner of the index for the Google Drive link.
For wget setup refer… I personally referred to this guide to set up my wget.

And in case you're not familiar with cmd:

It's working… here's the proof… it's downloading along with the folders.
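For reference, a recursive wget invocation might look like the sketch below. The URL is a placeholder (not a real index), the flags may need adjusting per site, and a heavily JavaScript-rendered listing may not be crawlable this way at all:

```shell
# Recursively mirror a directory listing, keeping the folder structure.
# The URL below is a placeholder -- substitute the actual Goindex folder link.
#   -r                recurse into links
#   -np               never ascend to the parent directory
#   -nH               don't create an extra hostname directory locally
#   -R "index.html*"  skip the generated listing pages themselves
wget -r -np -nH -R "index.html*" "https://example-goindex.example.com/folder/"
```

wget2 accepts the same flags and adds parallel downloads, so the command carries over unchanged.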


I have written a script which downloads all the content to your Google Drive or to your local machine.

So… where can we get it?

Of course I will post it, with some modifications. Just give me some time.

You can read by

Can you please share that script?

So… where can we get it? What's the point of boasting about a script if you're not going to give it to us :rofl:


Dude, if I knew how to code, why would I ask for the script?
You posted the dancing emoji on my comment initially, that's why I thought you were trolling me.
We can see both the scripts, what's the issue… haha.
And I'm not sure what you guys mean by polished.
It just has to work, right… we're not asking for any fancy GUI interface… a working CLI with usage instructions will do just fine.


Hey everyone, I just created a new topic which contains the script to download Goindex contents. Waiting for approval of my post. Stay tuned.


See, I have done what I said. I am a man of my word. Had I not been able to post, I would have publicly apologized to you guys. But wait for my post to get approved.


Here are the overview instructions! Coding is not that difficult.

  1. Python
  2. Use beautifulsoup4 + selenium (Goindex is JavaScript based; bs4 by itself can't grab the folder and file links)
  3. Find all folders and files (ul id="list" --> li class="{see html from inspection}")
  4. Store all the links (probably save them in a JSON file)
  5. Iterate over all the links, then use aria2 to download
  6. Voila!
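The steps above could be sketched roughly as follows. This is an assumption-laden sketch, not the poster's actual script: the ul id="list" selector and the URL-joining come from inspecting a typical Goindex theme and may differ on other sites, and it assumes aria2c is installed and on your PATH.

```python
# Rough sketch of the steps above (not the poster's script).
# Assumptions: beautifulsoup4 and selenium are installed, aria2c is on PATH,
# and the listing lives in <ul id="list"> -- inspect your target site first.
import json
import subprocess

from bs4 import BeautifulSoup


def extract_links(page_html, base_url):
    """Step 3: collect file/folder hrefs from the rendered listing."""
    soup = BeautifulSoup(page_html, "html.parser")
    listing = soup.find("ul", id="list")
    if listing is None:
        return []
    return [base_url.rstrip("/") + "/" + a["href"].lstrip("/")
            for a in listing.find_all("a", href=True)]


# Step 2: Goindex renders the listing with JavaScript, so fetch the
# *rendered* page with selenium instead of a plain HTTP GET:
#   from selenium import webdriver
#   driver = webdriver.Chrome()
#   driver.get(base_url)
#   page_html = driver.page_source


def save_and_download(links, out_file="links.json"):
    """Steps 4-5: store the links as JSON, then hand each one to aria2."""
    with open(out_file, "w") as f:
        json.dump(links, f, indent=2)
    for url in links:
        subprocess.run(["aria2c", "-x", "8", url], check=False)
```

extract_links is plain bs4, so it can be tried against any saved page source; only the selenium fetch and the aria2c calls need a live site.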

Script below.


You're a man of your word, bro :sunglasses:

Thanks a lot man for the script.