Neocities’ current free plan gives 1 GB of storage, which is more than enough for hosting a static website.
I have a personal website with many photographs of my projects. You can take a look at it here.
With all the high-quality images, my personal website reached 400 MB, which was still OK, but I preferred to start optimizing the space.
To optimize, you first need to locate the folders and files that take up the most space.
For that purpose, I created a visualization tool which mimics the baobab Linux tool.
This article serves as a kind of README, describing how to use the tool.
There is no need to install software. Nevertheless, you need to download the scripts and have python3 and bokeh ready.
git clone git@github.com:Japoneris/Neo-CLI.git
Then, move into the folder:
cd Neo-CLI
To get the image of your website, you first need your API key, which is stored in the API_KEY.txt file. To fetch it, run:
./neocli.py auth
You will need to enter your Neocities credentials.
Next, you need to list the items on your website along with their sizes. (The standard Neocities API returns the list of files and folders; we simply store its output.)
./neocli.py size --save
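The snapshot is plain JSON. Since its exact layout is not documented here, a layout-agnostic sanity check is to recursively sum every integer leaf, which should roughly match your total site size (this is a sketch, not part of the tool):

```python
import json

def total_bytes(node):
    """Recursively sum every integer found in a JSON structure."""
    if isinstance(node, bool):
        return 0  # bools are ints in Python; skip them
    if isinstance(node, int):
        return node
    if isinstance(node, dict):
        return sum(total_bytes(v) for v in node.values())
    if isinstance(node, list):
        return sum(total_bytes(v) for v in node)
    return 0  # strings, floats, None: ignored

# Usage once a snapshot exists:
# with open("snapshots/latest.json") as fp:
#     print(f"{total_bytes(json.load(fp)) / 1e6:.1f} MB")
```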
This command stores the list of files on your website into the snapshots/ directory.
It creates two files:
snapshots/T_<current_unix_time>.json: for historical purposes;
snapshots/latest.json: for quick access to the latest generated file.
Note: If you want to use the visualization tool on a locally running website, or just to explore a local folder (as with baobab), there is a script called baobab/tree.py which produces the same format. Running:
./tree.py --path=<path_to_the_folder_to_explore>
generates a .json file with the same data structure.
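For reference, such a dump script boils down to a recursive directory walk. A minimal sketch (the real tree.py may use a different JSON layout) maps files to their size in bytes and directories to sub-dictionaries:

```python
import os

def build_tree(path):
    """Build a nested {name: size_or_subtree} mapping for a directory.

    Files map to their size in bytes; directories map to a sub-dict.
    """
    tree = {}
    for entry in os.scandir(path):
        if entry.is_dir(follow_symlinks=False):
            tree[entry.name] = build_tree(entry.path)
        elif entry.is_file(follow_symlinks=False):
            tree[entry.name] = entry.stat().st_size
    return tree

# Dump it like the snapshot files:
# import json
# with open("my_site.json", "w") as fp:
#     json.dump(build_tree("my_site"), fp, indent=2)
```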
Now that we have the fingerprint of your website, we can move on to the visualization part.
Go to the baobab folder:
cd baobab
Then, you just have to run the command:
./bokeh_baobab ../snapshots/latest.json
The script should open a webpage containing the result.
This page is stored in the baobab/html/ folder.
For my website, I obtained:
Here, you can see that my website does not need that much space (12 MB).
This is unrelated to loading speed, which depends on the <script> tags added here and there, and on other images, videos, etc.
We can see that the largest folder is assets/, which is not surprising, as it contains all the images.
Next comes the book/ directory, which contains all the compiled _post pages.
To zoom in on one particular folder, I created the following command:
./bokeh_baobab <my_website_tree.json> --sub_path=images
where --sub_path
corresponds to any reachable folder in the tree.
For instance,
./bokeh_baobab ../snapshots/latest.json --sub_path=assets
gives:
Here, you can see more clearly that /assets/images/ is the largest folder, followed by /assets/fonts/.
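Conceptually, the sub-path zoom just descends the nested tree before plotting. A hypothetical helper (illustrative only; the actual script may differ) could look like:

```python
def descend(tree, sub_path):
    """Walk a nested size dictionary down an 'a/b/c' style sub-path.

    Raises KeyError if a folder on the path is not in the tree.
    """
    node = tree
    for part in sub_path.strip("/").split("/"):
        node = node[part]
    return node

# Example on a toy tree:
# tree = {"assets": {"images": {"a.png": 100}, "fonts": {"f.woff": 40}}}
# descend(tree, "assets/images") returns the {"a.png": 100} subtree
```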
This was a short demo of the tool.
It makes it easy to see which folders to tackle first to reduce the storage footprint.
The code may evolve over time to simplify some commands. Check the GitHub repository in case of changes.
>> You can subscribe to my mailing list here for a monthly update. <<