Python script that generates local documentation by scraping the blog

Issue #343 resolved
Xin created an issue

This is not a proposal for the addon itself, but a suggestion for Thomas to generate documentation pages with the attached Python script (read the README.txt first). Others can also try it if they want.

Thomas could then upload the generated documentation as a packed .7z and offer it alongside the release link, so users have access to the documentation pages and blog posts locally, without needing an internet connection or having to deal with the ugly Blogspot interface, which makes searching for particular pages (especially older ones) tedious. Blogspot's connection is also quite slow for me.

Currently, all the important blog posts, in addition to the current release documentation, amount to ~50 MB including images.
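
For reference, here is a minimal sketch of the basic idea (fetch each blog post and save it as a local HTML page). The blog URL, post list, and file layout below are placeholders for illustration, not the attached script's actual code:

```python
from pathlib import Path
from urllib.parse import urlparse

import requests

BLOG_URL = "https://example.blogspot.com"  # placeholder, not the real blog
OUTPUT_DIR = Path("documentation")

def save_post(url: str, out_dir: Path) -> Path:
    """Download one blog post and write it to a local .html file."""
    out_dir.mkdir(parents=True, exist_ok=True)
    response = requests.get(url, timeout=30)
    response.raise_for_status()
    # Derive a file name from the last segment of the post URL.
    name = urlparse(url).path.rstrip("/").split("/")[-1] or "index"
    target = out_dir / f"{name}.html"
    target.write_text(response.text, encoding="utf-8")
    return target

if __name__ == "__main__":
    # Hypothetical post list; the real script presumably discovers posts
    # from the blog's feed or archive pages.
    posts = [f"{BLOG_URL}/2020/01/example-post.html"]
    for url in posts:
        print("saved", save_post(url, OUTPUT_DIR))
```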

Comments (13)

  1. Xin reporter

    Oh nice. I updated the styling a little and added the latest blog post too.

    Remember that the “downloaded_articles” directory is not needed in the final output and can be deleted. It stores raw downloads of the blog posts so that, if an error occurs, you don’t have to re-download everything. Once you are done (or if you want to re-download updated versions of the posts), you can delete it; see the sketch below.

    The new attachment has the new version.
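
    A rough sketch of what that cache does, using hypothetical names (fetch_article, CACHE_DIR) rather than the script’s actual ones:

    ```python
    from pathlib import Path

    import requests

    # Raw-download cache; safe to delete once generation is finished.
    CACHE_DIR = Path("downloaded_articles")

    def fetch_article(url: str, name: str) -> str:
        """Return a post's raw HTML, reusing a cached copy if one exists
        so a failed run does not force re-downloading everything."""
        CACHE_DIR.mkdir(exist_ok=True)
        cached = CACHE_DIR / f"{name}.html"
        if cached.exists():
            return cached.read_text(encoding="utf-8")
        response = requests.get(url, timeout=30)
        response.raise_for_status()
        cached.write_text(response.text, encoding="utf-8")
        return response.text
    ```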

  2. Xin reporter

    That was an old message. I don’t know why this issue was pushed to the front; maybe someone posted a message and then deleted it?
