[Script] Download all the files in a directory from GoIndex websites

This script downloads all the contents of a specified directory on a GoIndex website. You just have to specify the download location and give it the directory link of any GoIndex website. I wrote this script for Linux-based OSes, but with a little modification it can run on a Windows machine too; you just need a bit of knowledge of configuring Selenium for Windows.
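
For the Windows adaptation, the only piece that really changes is how Selenium finds the browser driver. Here is a minimal sketch (the driver paths are my assumptions, not taken from the script):

    from selenium import webdriver
    from selenium.webdriver.chrome.options import Options

    options = Options()
    options.add_argument("--headless")    # run without a visible browser window
    options.add_argument("--no-sandbox")  # needed in sandboxed environments like Colab

    # Linux (e.g. Colab): chromedriver is found on PATH after a system-wide install
    driver = webdriver.Chrome(options=options)

    # Windows: point Selenium at a downloaded chromedriver.exe instead, e.g.
    # driver = webdriver.Chrome(executable_path=r"C:\tools\chromedriver.exe",
    #                           options=options)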

Requirements

Python >3.5
selenium

Instructions/Installation/Usage

  1. Go to my Colab script link

  2. Run the first cell (it will mount Google Drive)

  3. Run the second cell (it will install all the modules required to run this script)

  4. On the left side you will see a file icon; click it and a (minimal) file explorer will open. Navigate to My Drive and right-click the folder where you want the GoIndex files downloaded, then copy its path.

  5. Go to line number 15, or look for "destinationDir", and paste the download path you just copied in step 4, like this: destinationDir = "/content/drive/My Drive/Saves"

  6. Then copy the link of the GoIndex directory you want to download, go to line number 16, and assign it like this (both edits are shown together in the sketch after this list):
    url = 'https://edu.tuts.workers.dev/Udacity%20-%20Collections%20[300%20GB]/Nanodegrees/Business%20Analytics%20Nanodegree%20v2.0.0/'

  7. Now just run the third cell and it will download the data
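
Putting steps 2, 3, 5 and 6 together, the first two cells and the two edited lines should look roughly like this (the install commands in cell 2 are my assumption about what that cell does; the path and URL are just the examples from above):

    # Cell 1: mount Google Drive (step 2)
    from google.colab import drive
    drive.mount('/content/drive')

    # Cell 2: install what the script needs (step 3) -- assumed contents
    # !pip install selenium
    # !apt-get install -y chromium-chromedriver

    # Cell 3, lines 15-16: the only two values you edit (steps 5 and 6)
    destinationDir = "/content/drive/My Drive/Saves"
    url = 'https://edu.tuts.workers.dev/Udacity%20-%20Collections%20[300%20GB]/Nanodegrees/Business%20Analytics%20Nanodegree%20v2.0.0/'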

Bugs so far

Not really a bug, but it only downloads one folder level deep; it does not recurse into sub-folders.

Note

I am currently working on another project. Since I had given my word that I would post a script, I posted this one in a hurry. You might encounter some bugs, so just comment, but I will only look at them when I have some time. Please don't start bashing me; it takes a lot of hard work to write such scripts. Hope you understand.

Disclaimer

Use it however you want

Well done, boss. :ok_hand: :pray: Thanks for sharing your script.

Bookmarked to be tried out later.

Cheers!!!

… And Over Every Possessor of Knowledge, There is (Some) One (Else) More Knowledgeable.

Well done, dude! The solution I made downloads to my local machine. Yours downloads to Google Drive, which is absolutely the best!

By modifying just two or three lines it can download to a local machine too.
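
For anyone curious, the "two or three lines" are plausibly something like the following; the exact lines depend on the script, so treat this as a sketch:

    # 1. Skip the Google Drive mount cell entirely -- there is no Drive locally.

    # 2. Point the destination at a local folder instead of a Drive path:
    destinationDir = "/home/user/Downloads/goindex"   # or r"C:\Users\you\Downloads"

    # 3. Point Selenium at your locally installed chromedriver:
    from selenium import webdriver
    driver = webdriver.Chrome(executable_path="/usr/local/bin/chromedriver")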

Share the local-machine downloading script.

Indeed. This banter tho. Good work buddy.

I have just created an improved version of this script.

Try this for local downloading… it works, I tried it… here's the proof

Nice info… Udemy Colab???

Is there any way I can access the Udacity files without running the script and installing anything on my laptop?

Just updated a little:

  • Download queue supported
  • Auto-domain URL detection (a sketch follows after this list)

Coming soon

  • tqdm-based multiple/parallel downloader
  • Aria2 integration
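
I can't speak for the actual implementation, but "auto-domain URL detection" presumably means pulling the host out of whatever GoIndex link the user pastes, so that relative links found while crawling can be resolved against it. A minimal sketch:

    from urllib.parse import urlparse, urljoin

    def detect_domain(url):
        """Extract the scheme and host from any pasted GoIndex link."""
        parts = urlparse(url)
        return "{}://{}".format(parts.scheme, parts.netloc)

    base = detect_domain("https://edu.tuts.workers.dev/Udacity/Nanodegrees/")
    # base == "https://edu.tuts.workers.dev"

    # A relative href scraped from a page (hypothetical example) resolves against it:
    absolute = urljoin(base, "/Udacity/Nanodegrees/lesson-01.mp4")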

You must be enrolled in a Udacity course to download, and you also need to pass your login credentials to the script to download on your own.

The answer to your question is a straightforward no. But you might find the course on the internet; just give it a try.

Hint: change edu.tuts.workers.dev → lol.freecoronavirus.workers.dev
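
In code terms, the hint is just a string substitution on the url from step 6:

    url = url.replace("edu.tuts.workers.dev", "lol.freecoronavirus.workers.dev")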

Thanks bro… very nice script. My only request is to add forms so noobs know where and what exactly the required user input is, instead of users editing the code itself and messing it up.
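
Colab supports exactly this with #@param form fields: annotate the config lines as below and Colab renders text boxes next to the cell, so nobody has to touch the code. A sketch of what that could look like (not the script's actual code):

    #@title GoIndex downloader settings
    destinationDir = "/content/drive/My Drive/Saves"  #@param {type:"string"}
    url = "https://edu.tuts.workers.dev/"  #@param {type:"string"}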

This ^

For example, as a common mortal, I have no idea what to do with these wonderful scripts you guys made.

Thanks, great work.

Version 2 has just been added.

Features

  • Recursive crawler (atlonxp)
  • Download all folders and files in a given url (atlonxp)
  • Download all folders and files in sub-folders (atlonxp)
  • Adaptive delay in fetching url (atlonxp)
  • Store folders/files directly to your Google Drive (pankaj260)
  • Folders and files exclusion filters
  • Download queue supported
  • Auto-domain URL detection
  • API-based GoIndex crawler
  • Parallel/Multiple files downloader

You can download files faster :smiley:
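
For reference, here is a minimal sketch of what an API-based recursive GoIndex crawler can look like. It assumes the JSON endpoint that classic GoIndex deployments expose (a POST to a folder path returns that folder's listing); field names can differ between forks, and the actual script may do this differently:

    import requests
    from urllib.parse import urljoin, quote

    FOLDER_MIME = "application/vnd.google-apps.folder"

    def crawl(base_url, path="/", depth=0):
        """Recursively list folders and files via the GoIndex JSON API."""
        resp = requests.post(
            urljoin(base_url, quote(path)),
            json={"password": "", "page_token": None, "page_index": 0},
        )
        resp.raise_for_status()
        for f in resp.json().get("data", {}).get("files", []):
            child = path + f["name"]
            if f.get("mimeType") == FOLDER_MIME:
                crawl(base_url, child + "/", depth + 1)  # descend into sub-folder
            else:
                print("  " * depth + child)              # or push onto a download queue

    crawl("https://edu.tuts.workers.dev")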

Cool, you guys are really doing good work on this project. Thank you; it will help many.

The site this script was written for has moved to another hosting provider, making the script unusable.