Electric skateboard builders forum back online *Now start saving content*

Without Enertion extracting the data, there is little hope.

When I was backing up pages and pages of safety-related information about axle stress that got deleted from this forum in the Hummie Kickstarter thread, I used good old screenshots.

@longhairedboy Are admins the only people who are allowed to make a discourse backup?

This. Actually, we should post a tutorial on how to do that and pin it so everybody can help. Everything gets saved as an HTML page (complete), and then we can build a start page with links to the saved content.
edit: I’m building an animated tutorial rn, just in case someone else is also thinking about it :smiley:

When I get back to my Windows computer I’ll see if I can find a way to save it, but no promises.

  • Scroll to the top.
  • Go to the address bar, scroll to the left (or jump to the end ;)).
  • Type “?print=yes” and press Enter.
  • Wait.
  • Dismiss the print window.
  • Choose “Save page as”.
  • Choose a location; if it isn’t already set, change the format to “Web page, complete” in the lower right.
  • Save.
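The same steps can be scripted instead of done by hand. A rough sketch with wget (the thread URL is a placeholder, and the actual fetch only runs when `RUN_FETCH` is set, so the script is safe to dry-run):

```shell
#!/bin/sh
# Hypothetical thread URL -- substitute a real one from the forum.
THREAD="https://www.electric-skateboard.builders/t/example-thread/12345"

# build_print_url appends the print query string described above.
build_print_url() {
    printf '%s?print=yes' "$1"
}

# --page-requisites pulls in images/CSS, --convert-links rewrites them
# for offline viewing, --adjust-extension adds .html -- roughly the wget
# equivalent of the browser's "web page, complete".
if [ -n "${RUN_FETCH:-}" ]; then
    wget --page-requisites --convert-links --adjust-extension \
        "$(build_print_url "$THREAD")"
fi
```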
Does it save subpages?

Does anyone have experience with crawlers? I can volunteer large amounts of archive space.

onloop is drinking whiskey and laughing at everyone lol

Yes, everything is going perfectly in his world.

No, only the thread; it doesn’t follow links. Or what do you mean by subpages?

Thanks for this

Saved a few of my threads, but my phone can’t handle opening the big one in its entirety. Hope the site stays up until I can get home.

Well, shit, even printing has a limit

If you really want to get a backup, use this.

http://www.httrack.com

Keep in mind that the forum is massive and Enertion pays primarily for traffic; using this will drive up their traffic usage.
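To keep a full-site mirror from hammering the server, httrack’s rate and connection limits help. A sketch, with the output directory and limit values being my own guesses (the crawl only runs when `RUN_MIRROR` is set):

```shell
#!/bin/sh
# Where the mirror will be written -- an assumption, pick your own path.
OUTDIR="$HOME/esk8-mirror"

# -O       output directory
# +.../*   filter: stay on the forum's own host
# -A25000  cap transfer at ~25 KB/s, -c2 use at most 2 connections,
#          so the crawl stays gentle on Enertion's bandwidth bill.
if [ -n "${RUN_MIRROR:-}" ]; then
    httrack "https://www.electric-skateboard.builders/" \
        -O "$OUTDIR" "+www.electric-skateboard.builders/*" -A25000 -c2
fi
```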

So…everyone needs to use this at the same time?

Are you sure it works for Discourse? It’s a JS-based forum that dynamically loads content. I just tried httrack, and the stuff it already got doesn’t look promising.

Btw, my tutorial has a flaw: on huge threads there’s a “next page” link at the bottom. You have to follow that too and save it again under a slightly different name (e.g. “_1”).
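If the print view paginates with a page query parameter (an assumption based on the “next page” link; check the link’s actual URL before relying on it), the save-with-suffix step could be looped instead of done by hand:

```shell
#!/bin/sh
# Hypothetical thread URL -- substitute a real one.
THREAD="https://www.electric-skateboard.builders/t/example-thread/12345"

# save_name builds the suffixed filename described above, e.g. thread_1.html
save_name() {
    printf 'thread_%s.html' "$1"
}

# Fetch each print-view page in turn; the page range is arbitrary and the
# fetch only runs when RUN_FETCH is set, so this is safe to dry-run.
for page in 1 2 3; do
    if [ -n "${RUN_FETCH:-}" ]; then
        wget -O "$(save_name "$page")" "${THREAD}?page=${page}&print=yes"
    fi
done
```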

It works, I’ve used it, both in the past and currently, on Discourse installations.

LOL, prolly not :slight_smile:

Did you use any special settings? I’m using HTTraQT, a Qt frontend to the tool.

Yeah, don’t do that, you’ll DDoS the site lol

I used to get good results with Kali using wget commands; you have to take it slow or use proxies, or the site will ban your IP.
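“Taking it slow” with wget usually means its built-in throttling flags. A sketch with arbitrary limits (the crawl only runs when `RUN_CRAWL` is set):

```shell
#!/bin/sh
WAIT=2        # seconds between requests -- arbitrary, tune to taste
RATE="50k"    # bandwidth cap -- also arbitrary

# Throttled recursive crawl: --wait plus --random-wait space out requests,
# --limit-rate caps bandwidth, --level bounds recursion depth, and
# --no-parent keeps wget inside the forum. All are standard GNU wget flags.
if [ -n "${RUN_CRAWL:-}" ]; then
    wget --recursive --level=2 --no-parent \
        --wait="$WAIT" --random-wait --limit-rate="$RATE" \
        "https://www.electric-skateboard.builders/"
fi
```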

You can also use something like the Burp Suite free edition to crawl the site and scrape the URLs.

If the site goes down, go here and you can download a lot of crawled webpages; there is software that will scrape that site and rebuild it as best it can.

I didn’t; I’m using the Windows version, if that matters.