
Baobab Demo for Neocities

Need to save space on your website? First, look at which items consume the most space. To do that, I coded a tool similar to Baobab for Linux: it takes any folder as input, including a website stored locally.



Introduction

The current Neocities free plan offers 1 GB of storage, which is more than enough for hosting a static website.

I have a personal website with many photographs of my projects. You can take a look at it here.

With all the high-quality images, my personal website reached 400 MB, which was still OK, but I preferred to start optimizing the space.

To optimize, you first need to locate the folders and documents taking up the most space. For that purpose, I created a visualization tool which mimics the Linux tool baobab.

This article is a kind of README and describes how to use the tool.

Table of Contents

Tutorial

Set-Up

There is no software to install. You only need to download the scripts and have python3 and bokeh ready.

git clone git@github.com:Japoneris/Neo-CLI.git

Then, move into the folder:

cd Neo-CLI

To get the image of your website, you need to get your API key.

  • If you have it (can be found on the Neocities website), then store it into API_KEY.txt file.
  • If you don’t have it yet, authenticate yourself with ./neocli.py auth. You will need to enter your Neocities credentials.

Getting the File List

Next, you need to list the items on your website along with their sizes. (The standard Neocities API returns the list of files and folders; we simply store its output.)

./neocli.py size --save

This command stores the list of files on your website into the snapshots/ directory.

It creates two files:

  • snapshots/T_<current_unix_time>.json: For historical purposes;
  • snapshots/latest.json: To get access quickly to the latest generated file.
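The exact schema written by neocli.py is not shown here, so as an assumption, suppose each snapshot entry records at least a path and a size in bytes (this mirrors what the Neocities list API exposes). A minimal sketch that loads such a snapshot and reports the total size might look like:

```python
import json

def total_size(snapshot_path):
    """Sum the sizes (in bytes) of all entries in a snapshot file.

    Assumes each entry is a dict with "path" and "size" keys;
    the real output format of neocli.py may differ.
    """
    with open(snapshot_path) as fp:
        entries = json.load(fp)
    return sum(entry.get("size", 0) for entry in entries)
```

For example, `total_size("snapshots/latest.json")` would return the total number of bytes your site occupies, which you can compare against the 1 GB quota.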

Note: If you want to use the visualization tool on a locally running website, or simply to explore a local folder (as with baobab), there is a script called baobab/tree.py which produces the same format:

./tree.py --path=<path_to_the_folder_to_explore>

which will generate a .json file with the same data structure.
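For reference, the core of such a local scan needs nothing beyond the standard library. The sketch below is not the actual tree.py, and it assumes the flat {"path", "size"} record format described above; it emits one record per file:

```python
import os

def scan_folder(root):
    """Walk `root` and return a flat list of {"path", "size"} records,
    with paths relative to `root`.

    Assumed to match the snapshot format; the real tree.py may differ.
    """
    records = []
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            full = os.path.join(dirpath, name)
            records.append({
                "path": os.path.relpath(full, root),
                "size": os.path.getsize(full),
            })
    return records
```

Dumping the returned list with `json.dump` would then give a file the visualization script can consume.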

Generating the Map

Now that we have the fingerprint of your website, we can move on to the visualization part.

Go to the baobab folder:

cd baobab

Then, you just have to run the command:

./bokeh_baobab ../snapshots/latest.json

The script should open a webpage containing the result. The page is also stored in the baobab/html/ folder.
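Under the hood, a baobab-style ring chart needs per-directory totals, which can be rolled up from the flat file list. Here is a hedged sketch of that aggregation step (the actual script may compute it differently), again assuming the flat {"path", "size"} format with "/" separators:

```python
import posixpath
from collections import defaultdict

def folder_totals(entries):
    """Roll file sizes up into every ancestor directory.

    Returns {dir_path: total_bytes}, with "" denoting the site root.
    Assumes entries are dicts with "path" and "size" keys.
    """
    totals = defaultdict(int)
    for entry in entries:
        folder = posixpath.dirname(entry["path"])
        # Credit the file's size to each ancestor, up to the root "".
        while True:
            totals[folder] += entry["size"]
            if folder == "":
                break
            folder = posixpath.dirname(folder)
    return dict(totals)
```

Each directory's total is then proportional to the angle of its wedge in the chart.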

Demo

Main View

For my website, I obtained:

Here, you can see that my website does not take up much space (12 MB). Note that this is unrelated to loading speed, which depends on the <script> tags added here and there, as well as on images, videos, etc.

We can see that the largest folder is assets/, which is not surprising since it contains all the images. Next comes the book/ directory, which holds all the compiled _post pages.

Detailed View

To zoom on one particular folder, I created the following command:

./bokeh_baobab <my_website_tree.json> --sub_path=images

where --sub_path corresponds to any reachable folder in the tree.

For instance, ./bokeh_baobab ../snapshots/latest.json --sub_path=assets gives:

Here, you can see more clearly that /assets/images/ is the largest folder, followed by /assets/fonts/.

Conclusion

This was a short demo of the tool.

It clearly shows which items to tackle first to reduce the storage footprint.

The code may evolve over time to simplify some commands. Check the GitHub repository for changes.



>> You can subscribe to my mailing list here for a monthly update. <<