Colab runtime error / Colab frequently disconnects

Hello guys,
I need your help or suggestions. I am facing a problem when using Colab to transfer my important data from MEGA to Google Drive.
When I connect Google Drive to Colab, the data transfers from MEGA to Google Drive smoothly, but after a few seconds Colab disconnects and the runtime is disabled.
PFA.




Yup, it happened to me too. I am doing the same thing, transferring my data from MEGA to Google Drive.

The limits are reached because of excessive usage. It has been doing the same to me for the past 5 days.

Use this → colab console script.txt (222 Bytes)

Run this script to keep your session active: press F12, go to the console, paste the script, and press Enter. It might output a random number; don't worry about that.

You may try running Colab under a different Google account than the one you want to transfer to: Colab runs under account1 and you mount the Drive of account2. That way your runtime will not get disconnected.

Check your DMs. I have suggested changes, since what you are doing with a pre-made script is not efficient at all.

Posting this info here in case someone from the future ends up having the same problem.
Start with a new notebook and run:

!wget https://beta.rclone.org/branch/fix-dropbox-batch-sync/v1.55.0-beta.5334.ac7acb9f0.fix-dropbox-batch-sync/rclone-v1.55.0-beta.5334.ac7acb9f0.fix-dropbox-batch-sync-linux-amd64.deb

!apt install ./rclone-v1.55.0-beta.5334.ac7acb9f0.fix-dropbox-batch-sync-linux-amd64.deb

This installs rclone from a GitHub branch into your Colab instance.

Then run:

!rclone config

Run this and configure both your Google Drive and MEGA remotes. Look it up if you face any difficulty, but I guess you will manage this much.
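If you prefer to skip the interactive wizard, the remotes can also be created non-interactively with `rclone config create`. A minimal sketch, assuming the remote names `mega` and `gdrive` used in this thread and placeholder MEGA credentials (Google Drive still needs an OAuth token, so the interactive wizard is usually easier for that one):

```shell
# Non-interactive remote creation (sketch; credentials are placeholders).
# MEGA only needs a username and password:
rclone config create mega mega user "you@example.com" pass "your-mega-password"

# Google Drive normally needs an OAuth token, so running the interactive
# wizard (`rclone config`) just for this remote is usually simpler:
rclone config create gdrive drive

# Verify both remotes exist; this should list "mega:" and "gdrive:":
rclone listremotes
```

Depending on your rclone version, the password may need to be passed through `rclone obscure` first; check `rclone config create --help` for your build.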

The following command will do a test run of what you are about to transfer to Google Drive. No real transfer takes place in this step; rclone only does the calculations and reports what would be transferred if you ran the command for real. It is just to verify the configs:

!rclone copy "mega:stuff/" "gdrive:stuff/" --update --fast-list --dry-run

In the command above I am assuming you named your remotes mega and gdrive, corresponding to the two clouds, and that the folder you are transferring is called stuff. If you are copying everything in MEGA, just erase whatever comes after the colon.

After it gives a satisfactory output, run the command below:

!rclone copy mega:stuff/ gdrive:stuff/ --update --verbose --fast-list --transfers=100 --checkers=8

and the transfers should start at a good overall speed.

Limitation 1: MEGA's per-file transfer speed via rclone is low, so this is useful mainly when there is a large number of files to transfer, say lots of text files, small video files, or images.
Limitation 2: Google Drive can commit only about 2 uploads per second. That is not too bad, but if you have a very high number of files it becomes a problem.

Speaking from personal experience: I transferred 1 TB of data, around 65.5k files of variable sizes. It took 6-7 hours, but at least it was super smooth.

Experimental tip:

You can use the following args if the files you are transferring are humongous in number, but I don't guarantee anything, as they might cause more trouble than good:

--tpslimit=60 -vv --drive-pacer-min-sleep=60ms
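Put together with those flags, the full command would look something like the sketch below. The remote and folder names are the same assumptions as before, and the flag values are the ones suggested in this thread, not values tuned for your particular data:

```shell
# Combined copy command (sketch): the earlier transfer flags plus the
# experimental pacing flags. Tune --transfers/--tpslimit to your data.
!rclone copy mega:stuff/ gdrive:stuff/ \
    --update --fast-list \
    --transfers=100 --checkers=8 \
    --tpslimit=60 --drive-pacer-min-sleep=60ms -vv
```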


Won't work, I tried.

What are you asking, bro? "torrent to gdrive", what does that mean?

colab code

Bro, MEGA to Google Drive...
While transferring data, the runtime suddenly disconnected.

If any of you have any doubts, do ask.

Thanks buddy for your help.
I configured rclone according to your instructions, but it takes too much time: up to 5 hours for a 100 GB transfer from MEGA to Google Drive, and it is still processing. I have 50 TB of data in MEGA to transfer, and only 15 days are left before expiration. Any suggestions?

If that is the case,
the problem is that only a fraction of MEGA's transfer capacity can be used by rclone, thanks to an undisclosed API limit.

Your best option will be to use an RDP with very fast internet access.

You will have to use MEGAcmd to download as much as you can fit on the RDP, and then use rclone to move it to any other cloud with the needed args.
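A rough sketch of that staged approach. The account, folder, and staging path are placeholders; `mega-login`, `mega-get`, and `mega-logout` are real MEGAcmd commands, but check `mega-help` on your version for exact behavior:

```shell
# Stage 1: pull data from MEGA to local disk on the RDP with MEGAcmd.
mega-login "you@example.com" "your-mega-password"
mega-get /stuff /home/user/staging/          # download a remote folder

# Stage 2: push the staged files to Google Drive with rclone.
# `move` deletes local copies as they upload, freeing disk for the next batch.
rclone move /home/user/staging/ gdrive:stuff/ \
    --update --verbose --transfers=10 --checkers=8

mega-logout
```

Repeat the two stages in batches sized to the RDP's free disk space.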

You will have to tell me the type of files you are trying to transfer: are there huge files, or a humongous volume of small files like txt files/images?

When I am using Colab (transferring data from MEGA to Google Drive), I keep facing that runtime error, as shown in the attached pic. Any solution?

Come to my DMs.

Mostly movies and course videos...
You are right, I will go with the RDP option if my requirements are not fulfilled.