[SOLVED] How to mirror a website with SSO/OAuth-only login (GitHub/Facebook/Google)?

I created a trial account for https://www.interviewcake.com and I want to copy the content for offline viewing and future use. I tried HTTrack and WebCopy, but these tools just copy the public site without the login session.
How can I copy websites that only offer SSO (single sign-on) login?
What configuration do I need to solve this?
Please help me with this.
Thank you in advance! :grin:


I don’t have a solution, but please share with us if you find one :slight_smile:


Let’s see, I’m also searching!

As of now, I found this information on the HTTrack forum, but it’s outdated and doesn’t cover SSO login:

Subject: Re: How to copy site with login?
Author: Daniel
Date: 01/19/2016 06:13

I don’t know if you have figured this out or not. I spent several hours
investigating this issue today and I was finally able to download my phpBB
site logged in as the administrator. Here is what I did:

  1. Use “capture URL” and fill in the user name and password.
  2. Log in to the phpBB site and use the “Export Cookies” extension in
     Firefox to export the cookies.
  3. Copy cookies.txt to the project’s root directory.
  4. Copy the Browser ID from the “hts-post0” file in the root directory
     and paste it into Set Options –> Browser ID.
  5. Exclude the log-out address in Set Options. If you are using phpBB, the
     rule should look like this
  6. Because I don’t intend to download the control panel, I also exclude the
     link to the control panel.
  7. Finally, keep Firefox open on the site you want to download.

That’s it. Good luck to you!
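The cookie-reuse trick in the quoted steps can also be scripted directly. Here is a minimal sketch, assuming Python with only the standard library: it loads a browser-exported cookies.txt (Netscape format, as the “Export Cookies” extension writes it) into a cookie jar and attaches it to an opener, so requests go out with the logged-in session. The cookie values below are made-up placeholders; in practice you would use the file exported from Firefox.

```python
import urllib.request
from http.cookiejar import MozillaCookieJar

# A tiny stand-in cookies.txt with a fake session cookie; normally this
# file comes straight from the browser's cookie-export extension.
with open("cookies.txt", "w") as f:
    f.write("# Netscape HTTP Cookie File\n")
    f.write(".example.com\tTRUE\t/\tFALSE\t2147483647\tsessionid\tabc123\n")

# Load the exported cookies into a jar.
jar = MozillaCookieJar("cookies.txt")
jar.load(ignore_discard=True, ignore_expires=True)

# Attach the jar to an opener, and reuse the same User-Agent ("Browser ID")
# the site saw at login so the session cookie is not rejected.
opener = urllib.request.build_opener(urllib.request.HTTPCookieProcessor(jar))
opener.addheaders = [("User-Agent", "Mozilla/5.0")]

print([c.name for c in jar])  # -> ['sessionid']
# With a real cookies.txt you could now fetch members-only pages:
#   html = opener.open("https://example.com/members-only").read()
```

This mirrors what HTTrack does with the cookies.txt + Browser ID options, just in script form, which is handy if the tool’s built-in capture fails.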


If you manage it, please share here too. I guess you can manually scrape page by page (it will take time, but still…) or write a script to do so. The Udacity courses were scraped in a similar fashion, so you can look into how they did it.
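A page-by-page script like the one suggested above is mostly a loop over known URLs plus a URL-to-file mapping. Here is a small sketch, assuming Python; the example URLs are just illustrative, and the actual download line (which would need an authenticated opener with your session cookies) is left as a comment:

```python
import os
from urllib.parse import urlparse

def local_path(url, out_dir="mirror"):
    """Map a URL to a local file path, e.g.
    https://site/question/foo -> mirror/question/foo.html"""
    path = urlparse(url).path.strip("/") or "index"
    return os.path.join(out_dir, path + ".html")

# Illustrative page list; a real script would collect these from the
# site's index or sitemap.
urls = [
    "https://www.interviewcake.com/question/python/reverse-string",
    "https://www.interviewcake.com/article/python/big-o-notation",
]

for url in urls:
    dest = local_path(url)
    os.makedirs(os.path.dirname(dest), exist_ok=True)
    # With an opener carrying your login cookies, the fetch would be:
    #   open(dest, "wb").write(opener.open(url).read())
    print(dest)
```

Running it prints `mirror/question/python/reverse-string.html` and `mirror/article/python/big-o-notation.html`, showing where each page would land on disk.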


Do you have any references for scraping?
After a Google search, I found the Octoparse software.


Thank you @TheJoker! :grin:
