Cybergate9.Net

Open the gate.. welcome to this dimension..

Taking stock..

⊰ 2022-07-15 by ShaunO ⊱

Well.. a bit of taking stock and 'cleaning up'..

CSS, Responsiveness, & Design

I'd forgotten how frustrating CSS can be. It takes a couple of hours to get 'into CSS think' (grids, blocks, floats, margins vs padding, and tracing cascades..). Anyway, on a quick inspection after my last post, it turned out that the goal of being 'screen size responsive' was very broken. So some work went into rectifying that, as well as a first pass on making the design 'not suck too much'..

I've gone back to Tufte's fonts, which I found years ago and which really are very pretty, as well as eminently readable (even though they were designed for books, they have been tweaked to be nice 'on screen'). Edward Tufte has, unlike me, many design bones in his body - in fact you would say he's a design expert. The fact that we share a long-past interest in TeX and LaTeX also appeals.

So the design is tolerable down to 320x480 pixels now.

[Image: responsive view at 320x480 pixels]

Profiling

At this stage, having noticed the odd 'pause' on page loads, it seemed like a reasonable idea to look at whether there were any 'code bottlenecks' now, given the fairly major surgery which has been performed on the framework for this rebuild.

The first thing which came to light was the difference between require_once()/include_once() and their plain cousins. There's some chatter around about performance differences between them, but that doesn't seem to be the case in practice. So it was relatively straightforward to fix: decide to include all modules in the 'base' install of SF_Siteframework (particularly adding SF_cache.php and SF_urlpreviews.php), and manage loading manually with straight require()'s, in the correct order in the correct places.
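The manual loading described above might look something like this. A sketch only: the module file names are the ones mentioned in this post, but the load order and bootstrap layout are my own assumptions.

```php
<?php
// Hypothetical bootstrap sketch: explicit require() calls in a known
// order, rather than scattered require_once()/include_once() calls.
// File names are from the post; the order shown is an assumption.
require 'SF_Siteframework.php';
require 'SF_cache.php';        // caching module, now part of the base install
require 'SF_urlpreviews.php';  // URL preview module, likewise
```

The win here is less about raw speed and more about predictability: one place controls what is loaded and when.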

Having only ever done basic 'time testing' using microtime() in the past I thought I should probably look into the current best practice on profiling. Turns out, not surprisingly, there's good infrastructure for this using xdebug, which amongst other things can profile into 'Cachegrind' files, which in turn can be viewed using a very simple web frontend called Webgrind.
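For reference, the basic microtime() approach looks like the sketch below (this is the simple timing mentioned, not the xdebug/Webgrind setup itself; for Xdebug 3 profiling into Cachegrind files, the relevant ini settings are xdebug.mode=profile and xdebug.output_dir).

```php
<?php
// Minimal microtime() timing sketch: wrap the block of interest and
// report elapsed wall-clock time. The workload here is a placeholder.
$start = microtime(true);                              // float seconds
$out = str_replace('a', 'b', str_repeat('abc', 1000)); // stand-in workload
$elapsed = microtime(true) - $start;
printf("block took %.6f seconds\n", $elapsed);
```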

Some of the 'old gems' (single-quoted strings rather than double-quoted, etc.) which were used in PHP5 as efficiency speed-ups are no longer relevant, because apparently they were 'fixed' during PHP 7 and 8 development - and both locally and on my cloud host we're running PHP 8.1.x. So most of the 'excess time' seemed to come from inappropriate use of preg_match/replace(), strcmp(), array_key_exists(), and the like, where simpler functions (isset(), str_replace(), etc.) are much 'cheaper' alternatives. Having spent an hour or so looking through, and doing the obvious like-for-like replacements where appropriate, it has certainly shaved hundreds of microseconds off certain blocks of code. And where those blocks are called multiple times (e.g. the breadcrumb-line code is called four times on every page) this soon adds up if you find a few of them.
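The kind of like-for-like swap meant here can be sketched as follows; the values are invented for illustration, but each pair produces the same result with a cheaper function.

```php
<?php
// Like-for-like replacements: same output, cheaper function.

// Before: a regex where a fixed-string function will do.
$slug = preg_replace('/_/', '-', 'my_page_name');
// After: str_replace() avoids regex-engine overhead entirely.
$slug2 = str_replace('_', '-', 'my_page_name');

// Before: array_key_exists() on every lookup.
$opts = ['cache' => true];
$cached = array_key_exists('cache', $opts) && $opts['cache'];
// After: isset() is cheaper (beware: it differs for keys holding null).
$cached2 = isset($opts['cache']) && $opts['cache'];
```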

The really 'expensive' remaining stuff is where you'd expect it - in the Parsedown library, and in something like shorts.php where there's multiple array manipulation and sort operations..

So I'm calling the performance profiling 'done' for the time being.

Shortcodes

Shortcodes have had quite a bit of exploration, and it turns out they are best generalised to a very high level.

PHP has the ability to store a function name in a variable and call it. This essentially means the very top level of a 'shortcode processor' can simply look at the shortcodes on a page, store those it recognises (i.e. {{f:shortcode; var:value; var1:value1}}), and ignore those it doesn't.

Then, for each stored shortcode, you define a corresponding PHP function and call it when you 'see' the shortcode - easy!

It's very clean: each shortcode lives in its own PHP function, receiving all the contents (fields:variables) of the shortcode and returning the desired HTML+CSS+JavaScript output as required. The upper-level shortcode processor takes care of doing the 'replacements' within the source markdown.
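The mechanism described above can be sketched roughly like this. The {{f:name; var:value}} syntax is from this post, but the 'sc_' function prefix, the regex, and the example function are my own illustration, not the actual SF_Siteframework code.

```php
<?php
// Minimal shortcode-processor sketch using PHP variable functions.

// One shortcode = one function: receives parsed fields, returns HTML.
function sc_hello(array $args): string {
    $who = $args['who'] ?? 'world';
    return "<p>Hello, {$who}!</p>";
}

function process_shortcodes(string $md): string {
    return preg_replace_callback('/\{\{f:(\w+);?([^}]*)\}\}/', function ($m) {
        $fn = 'sc_' . $m[1];          // function name stored in a variable
        if (!function_exists($fn)) {
            return $m[0];             // unrecognised shortcode: leave as-is
        }
        $args = [];
        foreach (explode(';', $m[2]) as $pair) {
            if (!str_contains($pair, ':')) {
                continue;             // skip malformed field:value pairs
            }
            [$k, $v] = array_map('trim', explode(':', $pair, 2));
            $args[$k] = $v;
        }
        return $fn($args);            // call via the variable - easy!
    }, $md);
}

echo process_shortcodes('Intro {{f:hello; who:reader}} and {{f:unknown}}');
// prints: Intro <p>Hello, reader!</p> and {{f:unknown}}
```

Note how unrecognised shortcodes pass through untouched, which is what lets the processor safely ignore anything it doesn't know about.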

For efficiency I've decided that shortcodes will not process on every page - shortcode processing needs to be requested on a page by page basis by putting 'shortcodes: yes' in the markdown front matter.

It's probably not that inefficient to process shortcodes on every page, but I'm a bit of a purist and it seems wasteful to run somewhat 'expensive' code when it may only be required on a fraction of pages. It's fine from a future-proofing perspective too: if in future the 'shortcodes: yes' flag is no longer required (because I decide shortcodes will run on every page), it will simply be ignored.
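The per-page gate amounts to a one-line check against the parsed front matter. The array shape below is an assumption about how the front matter arrives after parsing; the 'shortcodes: yes' key is the one described in this post.

```php
<?php
// Sketch: run shortcode processing only when the page's front matter
// requests it. The parsed front-matter array shape is an assumption.
$frontmatter = ['title' => 'Taking stock..', 'shortcodes' => 'yes'];

$wantsShortcodes = ($frontmatter['shortcodes'] ?? 'no') === 'yes';
if ($wantsShortcodes) {
    // $html = process_shortcodes($markdown);  // hypothetical processor call
}
```

Because the check defaults to 'no' when the key is absent, pages without the flag skip the expensive path, and the flag itself is harmless if the policy later changes.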

Enough for today..
