LATEST ENTRIES



The framework's dead simple template engine was reworked to expand on its modular design. Namely, it now keeps track of the final HTML string in a stack and renders everything in a single, uninterrupted call. The content stack is filled using addFile() (adds and prepares a file for writing) and addString() (adds and prepares a string for writing), which completely replace the previous functions. Once filled, the internal content stack is finally sent to the client using render().

This smooths the server-to-client content parse since everything is sent sequentially, without interruptions. By writing the complete stack to the socket all at once, the potential for performance hiccups and browser-related issues greatly diminishes as the server load grows and the time between writes potentially increases. Previously, it was possible to break the parsing sequence by sandwiching unrelated operations between renderHeader() and renderFooter(), which started and ended the writing process respectively. Though this was never the intended design, as the projects relying on pocket_php diversify and its more elemental components get tested from different angles, the limitations of the overly restrictive templating interface became apparent, leaving services like REST APIs with little choice but to modify the core class in ways that don't disrupt its intended functionality. A clear design flaw.
The new TemplateEngine class

class TemplateEngine
{
    private $contentStack;
    private $stackCounter;

    function __construct ()
    {
        $this->contentStack = array();
        $this->stackCounter = 0;
    }

    public function addFile($filename, $data = NULL)
    {
        $fileContents = file_get_contents(VIEWS_DIR . $filename);
        if (empty($fileContents))
            return;
        else
            $this->addString($fileContents, $data);
    }

    public function addString($string, $data = NULL)
    {
        if (empty($string))
            return;

        if ($data !== NULL && is_array($data) && !empty($data))
            $this->contentStack[$this->stackCounter] = replaceValues($string, '{{', '}}', $data);
        else
            $this->contentStack[$this->stackCounter] = $string;

        $this->stackCounter++;
    }

    public function render()
    {
        if (!ob_get_status()) // IF ob hasn't been manually started by the user
            ob_start();

        foreach($this->contentStack as $section)
            echo($section);

        ob_flush();
    }
}
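
A minimal usage sketch of the new interface (the view filenames and the {{username}} keyword are made up for illustration):

$page = new TemplateEngine();
$page->addFile('header.html');                                    // Queued, not yet sent
$page->addString('<h1>Welcome back, {{username}}!</h1>', array('username' => 'lain'));
$page->addFile('footer.html');
$page->render();                                                  // Single, uninterrupted write to the client
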
The core/utility.php file was also overhauled. The template engine keyword replacement function replaceVariables() was substituted by replaceValues(), a more flexible version that can parse keywords between any two string delimiters, to encourage its use outside TemplateEngine. Basic directory management, a very general requirement for dealing with user uploads, was also added in the form of newDir() & delDir(). Check out the core/utility.php commentary on how to speed up directory deletion using system calls.
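
A quick sketch of the new utilities (the delimiters and paths below are made up, and the newDir() / delDir() signatures are assumed to simply take a path):

// replaceValues() is no longer tied to the {{ }} delimiters used by TemplateEngine
$greeting = replaceValues('Hello, [[name]]!', '[[', ']]', array('name' => 'lain'));

// Basic directory management for user uploads
newDir('app/uploads/avatars/'); // Create a directory (signature assumed)
delDir('app/uploads/temp/');    // Recursively delete a directory (signature assumed)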

It's ridiculous how much has changed in the past thirty years. Life has warped into an unrecognizable mess that resembles the post-apocalyptic dystopian fiction of the last century a little too much for my liking. Spiritually subversive propaganda, pharmacological behavior modifiers, weaponized social pressure and shaming, disinformation distribution, secretive psyops and false flags, artificial intelligence guided communities, openly totalitarian corporations, instigated irrational social and racial tensions, substitution of the nuclear family with governmental programs, psychological undermining of individuality and critical thinking...
The list of similarities goes well beyond mere coincidence, and yet, against all reason, it goes largely unnoticed.

The very basic mechanisms of organic human interaction have been profaned by irrational fear and ego-driven ignorance long enough for the natural imperative to protect one's way of life to slowly decay until it is completely replaced. Whatever cultural artifacts remain are mere vestiges of the past that, at best, find themselves renewed as twisted marketing assets for opportunistic corporations, and at worst, become another deceptively selected target for the misguided masses to attack. This establishes a self-preserving vicious circle of hatred towards the very ideas and traditions that nurtured humanity up to the present day, further straining the already crumbling social cohesion and encouraging isolation in an already apathetic society.

Those most unfortunate souls born within the past decade or so will be the ones that suffer the most. Their mental model of reality will be tainted by the coldness and indifference we've embraced during the most critical period of their development; they will learn from a very young age to accept their place in the new world as hedonistic slaves and forgo the endless potential inherent to their being.

POCKET_PHP has been updated with extra session management features. For starters, the database table for accounts and the provided HTTPRequest object now include the following session information:


session_timestamp: UNIX style timestamp of the session start
session_id: The UID PHP assigns to the user at session start
logged_in: Flag value that reflects the state of the session using this account
logout_type: How the session was last terminated


With this data the framework can provide a more thorough status of the account assigned to each session. This is necessary because PHP manages sessions by storing them either in a database or as a file in the local file system (the default), then sends the client this same UID as a cookie. When the client connects to the POCKET_PHP™©® server, the browser (by default) sends whatever cookies match the target domain along with the actual HTTP request; the server then matches the cookie's UID with its internal sessions and, if it finds a match, proceeds to assign the account data to the $_SESSION superglobal, linking the client's session to a particular account.
As long as the client keeps the session cookie and the server keeps the session alive (be it as a local file or an entry in a database), the user will be recognized by the server and automatically assigned the session's data on each request.
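
In plain PHP terms, the matching described above boils down to something like this (the $_SESSION key below is illustrative, not POCKET_PHP's actual schema):

session_start(); // Reads the SID cookie sent by the browser and loads the matching session, if any

if (isset($_SESSION['account_id']))
{
    // The cookie matched a live server-side session: this request is already linked to an account
    $accountID = $_SESSION['account_id'];
}
else
{
    // No match: PHP generated a fresh SID and sent it back to the client as a new cookie
}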


As simple and efficient as it may seem, the codependency between the user's cookie and the server's session registry does complicate things further. Neither side has any control over or notion of its opposite SID container. If the server decides to delete a user's session (i.e. delete the corresponding file from the sessions directory), then the user's cookie can't be verified and thus becomes useless. Likewise, if the user decides to manually delete or tamper with the cookie (or it simply expires), then no ID is sent with the user's requests and the server-side session file becomes useless. In the case of the former, the client's now invalid cookie is discarded and a new one is generated; in the case of the latter, however, it's the server's session file that is now invalid. The server will simply generate a new UID, store it as another session file and send it back as a cookie to the user, leaving the user's previous session file pointlessly waiting for activity.


The way PHP deals with these potentially obsolete files is by having a chance of triggering a garbage collection routine every time session_start() is called. The probability of it happening is calculated by dividing the php.ini settings session.gc_probability and session.gc_divisor. By default, PHP assigns these values to 1 and 100 respectively, giving the garbage collection a 1% chance of being triggered by calls to session_start(). This routine checks the existing session files and deletes them based on the expiry time set by session.gc_maxlifetime. Needless to say, each file has to be individually checked for expiration, making this a linear scan with O(n) cost, meaning that the more session files there are, the more computationally expensive the routine grows. Hence why PHP defaults to a 1% chance of calling it, and only during session creation (perhaps the least intrusive step to add more overhead to).
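
For reference, the relevant php.ini values can be inspected at runtime; this snippet just illustrates the 1% default described above:

// Default odds: session.gc_probability / session.gc_divisor = 1 / 100 = 1%
$gcChance = ini_get('session.gc_probability') / ini_get('session.gc_divisor');

// Session files older than this many seconds are deleted when the GC does run (1440 by default)
$gcMaxLifetime = (int) ini_get('session.gc_maxlifetime');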


In a scenario where the client's cookie is prematurely disabled and the server-side SID file remains, the account associated with the session won't be properly logged out; even the internal session life counter won't be recognized as expired because the client can no longer match the session file to trigger this check. This means that if the client (or anybody else, for that matter) tries to log in to this account, POCKET_PHP will recognize it as already logged in and must either reject the latest login attempt or reassign the account data to the newly created session. Either way, the previous session file should be manually destroyed to avoid having to wait for the garbage collector, hence why the SID assigned to an account on login is stored in the accounts table. Should a login clash occur, POCKET_PHP will attempt to manually destroy the session file saved in the account entry and replace it with the new SID. The account's logged in status then correctly reflects its pairing with the new session and its previous session identifier no longer wastes space by uselessly sitting in the sessions directory. It also updates the "logout_type" cell with "SESSION_HIJACKED".
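
A rough sketch of that clash cleanup, assuming the stale SID was fetched from the accounts table and that PHP's default file-based handler (sess_<id> files) is in use; the variable and column names are illustrative, not POCKET_PHP's actual source:

// $staleSID comes from the clashing account's row in the accounts table
$staleFile = session_save_path() . '/sess_' . $staleSID;
if (is_file($staleFile))
    unlink($staleFile); // Destroy the orphaned session instead of waiting for the garbage collector

// Re-link the account to the fresh session and record how the old one ended
$newSID = session_id();
// UPDATE accounts SET session_id = $newSID, logged_in = 1, logout_type = 'SESSION_HIJACKED' ...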


Do note that "SESSION_HIJACKED" simply means that the account shifted sessions without logging out; this is also the case when the server-side SID file is deleted or expires.


Of course, the other ways of ending a session are also reflected in the account's entry.


CLEAN_LOGOUT: The user manually logged out
SESSION_INACTIVITY: The user (with a valid SID cookie) refreshed the page after the inactivity tolerance (specified in SESSION_INACTIVITY_TOLERANCE) was over
SESSION_MAX_DURATION: The user (with a valid SID cookie) refreshed the page after the session max duration (specified in SESSION_MAX_DURATION) was over
SESSION_HIJACKED: The account's assigned session was reassigned without logging out


This extra control enables administrators to make better informed decisions regarding suspicious account activity. An account that continuously has its sessions hijacked is a clear sign that something's wrong, in contrast with the occasional hijack, which may simply be a user that forgot to log out of his account on his desktop computer and relogs to the same account from his phone before the garbage collector can clear the previous session. A smooth transition between sessions that offers a much better user experience than outright rejecting the login attempt.


The app/configuration.php file has been updated to include settings for the session management changes, including a new session directory (app/tools/sessions) to store the SID files on a per-virtual-host basis. This prevents PHP from piling up all the hosted sites' SIDs into the same folder and stops the php.ini global settings from messing with individual configurations. Sessions themselves can also be toggled on and off from the config file; if enabled alongside the included request tracker (TRACK_REQUESTS), they are also added to the tracked data.
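
As a rough idea of what the new settings look like (the values and the ENABLE_SESSIONS / SESSIONS_DIR names are assumptions for illustration; the other constant names are the ones mentioned in this post):

define('ENABLE_SESSIONS', true);                 // Toggle sessions on or off (name assumed)
define('SESSIONS_DIR', 'app/tools/sessions/');   // Per-virtual-host SID storage (name assumed)
define('SESSION_INACTIVITY_TOLERANCE', 30 * 60); // Seconds of inactivity before the session is terminated
define('SESSION_MAX_DURATION', 24 * 60 * 60);    // Hard limit on total session duration, in seconds
define('TRACK_REQUESTS', true);                  // Also add session info to the request tracker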


Finally, the included example site has been repurposed as a landing page and feature tester; keeping the project information updated in three different places (xenobyte, the project's README file and the example site) was getting tedious, not to mention redundant. It will instead serve as the default landing page for POCKET_PHP installations.


New home and settings pages

There are still a couple of relatively minor tweaks and additions I'd like to make before releasing a few more example projects as previously planned. But with my current workload I'm just glad I got this update through.

[#100] [24/06/2021][misc]
// Legends never die

The life of John McAfee is one that I've been following since I first learned about the man and his astonishing shenanigans. A true case of the eccentrically brilliant rebel, with a background so colorful and extreme that, were it not so well documented, it wouldn't be at all believable. Even for a fictional character.

For a free spirit like him, death should be nothing special.
R.I.P the coolest programmer

I constantly complain about the lack of website diversity on the modern internet. Sadly, the last remnants of the free, community driven internet have mostly faded into a concept of the past. Simply put, corporate influence over the infrastructure runs deep enough that, were it not for the scattered efforts of dedicated enthusiasts, the internet would be completely devoid of its original potential, becoming indistinguishable from any other lifeless, monopolized endeavor.

Surviving in cyberspace as a legitimately free and autonomous community requires total commitment to the engineering, administration, maintenance, legality and moderation of the platform, such that only those that manage to turn a profit that justifies the effort and risk have any chance of survival. Even then, the myriad ways that the all-powerful competition can undermine smaller platforms make it implausible for anything but the most sophisticated, underground efforts (onion markets, private decentralized messageboards, etc.) to have a fighting chance. Digital platforms on the mainstream internet are, therefore, always at risk of being targeted and dismantled, further discouraging the rise of new online ventures and keeping the ones online in check. Step out of line and you'll be silently culled.

Decimated as the autonomous domains have become, there has been a comparatively small yet symbolically victorious resurgence of anti-conformism with the current state of affairs by the programmers and engineers that the monopolization of the net has displaced. The few independent platforms that still hold some relevance have made it this far partially thanks to the collective, organic efforts of their respective administrations and userbases in defiance of the dystopian alternatives. However, finding such communities is becoming an increasingly difficult challenge, in particular for the newer generations that, despite being born in a post-internet world, rarely venture beyond their mainstream prisons and have thus been conditioned to rely on corporate solutions for what, in reality, are very basic services. To make matters worse for these unfortunate individuals, their mega-corporations of "choice" are most definitely and ruthlessly undermining their clientele by abusing their trust and ignorance, often through sophisticated means that most are not willing to even believe real.

Nevertheless, the aforementioned digital realms where the last exchanges of unfiltered, untainted ideas still happen remain open, eagerly awaiting new participants to nurture their communities. A good example of such platforms is lainchan, a cyberpunkish, anonymous imageboard named after a surprisingly introspective if somewhat niche anime, that revolves around discussing related subjects. Not particularly known for the stereotypical shitposting inherent to anonymous communities, it's one of the last bastions of cyberpunk culture worth participating in, at least on the clearnet. It's publicly available and requires no registration to post, making it an easy target for hackers, trolls and shills; despite this, the board remains relatively clean, probably due to its moderate popularity and organized moderation. Regardless, responses on anything but the most active threads do take their time to pour in; it's no reddit or 4chan in terms of sheer population density, that's for sure. It's to be expected, since its pro-freedom / anti-censorship attitude gets invariably associated with some recently conceived social taboos (an excellent way of reinforcing mainstream, sterilized preferences) that filter out many, if not most, potential new users.

Those that do stay around tend to be more than just wandering cybernauts; a good deal of them can be considered aficionados of at least one of the discussed subjects. Such users are among the best additions to any community because they have the drive to participate in the current discussions, and it's the mix of attuned veterans, eager new members and the liberty to exchange ideas that solidifies the platform's culture, crucial for the long term success of any collective. It's the unwritten rules of the game that ultimately serve as a catalyst for the personality that will be associated with the brand, hence why the mainstream alternatives all seem to complement the same superficial image of sterility and safety that encourages their users to stay within corporate influence. Lest they learn a wrong opinion or two.

For the past half year or so, I've been following a series of threads on the lainchan imageboard centered around creating an anonymous webring of sorts. The idea behind it is to unite the many personal web projects that remain scattered throughout cyberspace into a decentralized confederation of related sites. A graph where every node is its own independent realm with a section that links to other such places, removing the need for any centralization while retaining the common context of the projects. I've yet to add the more recent ones but I'm confident that most of the participating sites have been added to the links section by now, alongside many other unrelated links.

Current websites in the webring

The chain of threads (currently on its fifth iteration) spans hundreds of posts and around 60 or so participating domains, each an individual entity somewhere in the vastness of cyberspace, linked to the rest of the mesh by will and interest alone.
Needless to say, these types of communities are very reminiscent of the pre-corporate internet, before "social media" giants held a monopoly on information exchange and finding like-minded netizens required extra effort and technical skills. Each new site first had to come across a webring, usually by surfing around forums or imageboards of interest, then join the community and eventually the webring itself. This clever use of third-party platforms to strengthen individual domains, although not that common of an occurrence, mainly because setting up a personal website acts as a filter, has been bridging personal projects since the early days of the web.

It's no surprise that these same users contribute all sorts of interesting ideas to their respective circles; some of the anons in the lainchan webring even started experimenting with scripts that illustrate the current state of the webring's mesh as a node graph.

The original concept post

A second poster then uploaded a graph generated by his own script.

A more readable and hilarious graph (post)

A quick glance at the graph makes it apparent that the xenobyte.xyz node introduces many unrelated links that the script still counted as part of the lainchan webring. To facilitate anon's efforts, I created a separate link that hosts only the lainchan webring links right here.
I'll make sure to contribute something cool of my own in the future.

All the projects I've worked on that rely on pocket_php and make use of POST requests to process user provided data require checking whether the request was indeed sent as POST. The HTTPRequest->arguments member variable was supposed to abstract this away by only parsing the POST data into the request arguments array if the request was indeed a POST; if it happened to be a GET request containing data that would otherwise be sent by POST, then the engine would discard those form elements. Not a big deal, since it's mandatory to check if the data is even present in the first place, but the monolithic arguments container still loses the intended context by mashing both GET and POST arguments into a single entity.

The HTTPRequest->arguments variable has been replaced by HTTPRequest->GET & HTTPRequest->POST respectively. This means that any and all inline URL arguments will always be present in GET, even if the request was originally of type POST, and the POST argument array will only be populated on POST requests.
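
For example (how the HTTPRequest object reaches the handler is framework plumbing I'm glossing over; the field names are illustrative):

// $request is the HTTPRequest instance for the current request
if (!empty($request->POST))
{
    // Only populated when the request was actually sent as POST
    $email = isset($request->POST['email']) ? $request->POST['email'] : NULL;
}

// Inline URL arguments always live in GET, even if the request itself was a POST
$page = isset($request->GET['page']) ? (int) $request->GET['page'] : 1;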

Minor change with a big impact on readability.

I still have a beefy backlog to clear but I hope to upload a few pocket_php backed websites as documentation in the coming months.

If you've ever managed any kind of online service, you're well aware of how easy it is to abuse unprotected forms. A simple python script can wreak havoc by bombarding a server with randomized input that has to be carefully sanitized to prevent injections. Said processing, however, is typically the most computationally expensive request for a simple website to honor due to the complexity of the steps involved: sanitizing the user input, validating the cleansed data, writing it to a database, etc. Thus, unless we know for certain that the form we received was legitimately answered, it is best left discarded, lest we neglect gate-keeping the database only to find it littered with nonsense or worse.

pocket_php captcha samples
The previously mentioned script that fills the form with randomized characters wouldn't be too difficult to detect and mitigate, but a slightly improved version that generates a well formatted email address (regardless of its authenticity)? Not so much. Even potential solutions, complicated as they may get, would most likely only work when validating the email field, not the rest of the form, which would probably require its own specialized routines. The resulting increase in resource consumption doesn't even ensure the processed form was legit in the first place, only that it passed the aforementioned filters.

For the sake of brevity, now that the problem has been illustrated, I'll jump to the point. This is no easy problem to solve, but there are simple and effective precautions that universally apply to internet forms and will at least help deter automated injections, namely captchas.
The idea behind them is quite clever: exploit the fact that there are easy-to-generate problems that a computer still can't crack but a human can solve in an instant, with character recognition tests being among the most popular.

As for the inevitable cost, captchas may be straightforward but they are certainly not free, at least in contrast with leaving a form unprotected, and in cases where the server is working at or near capacity, having to generate captchas would definitely worsen the service's responsiveness. In the opposite scenario, where the server is practically idling, it's quite likely that generating the captchas becomes the most expensive step of the form's validation anyway. Alternatively, it's common practice to outsource captchas to reduce the local workload at the expense of the user's privacy.
Whichever you choose, safeguarding forms against automated attacks has a price, but it is practically ALWAYS WORTH PAYING.

The captcha functionality added to the pocket_php example login page is very simple and effective. It randomizes a set amount of characters from a given input string, draws randomly generated squares onto the randomly colored background to obfuscate the foreground, renders the selected characters at a random angle, position and color (within reason, it'd be counterproductive to make this hard for humans to answer) and finally, the generated string is stored in a PHP session variable to subsequently validate the client's answer. Should the created captcha be too difficult for a human to read, all the client has to do is ask for a new one, either by pressing the refresh captcha button or by reloading the form. Ezpz.

Note that the internal login captcha can be (de)activated by setting ENFORCE_LOGIN_CAPTCHA in app/configure.php and utilizes the php-gd library.
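
This is not pocket_php's actual implementation, just a minimal php-gd sketch of the approach described above (the session key, dimensions and font path are all made up):

session_start();

// Pick the answer from an input string and store it for later validation
$pool = 'ABCDEFGHJKLMNPQRSTUVWXYZ23456789';
$answer = '';
for ($i = 0; $i < 5; $i++)
    $answer .= $pool[rand(0, strlen($pool) - 1)];
$_SESSION['captcha_answer'] = $answer; // Key name is an assumption

// Randomly colored background plus noise squares to obfuscate the foreground
$img = imagecreatetruecolor(160, 60);
$bg  = imagecolorallocate($img, rand(150, 255), rand(150, 255), rand(150, 255));
imagefilledrectangle($img, 0, 0, 160, 60, $bg);
for ($i = 0; $i < 12; $i++)
{
    $noise = imagecolorallocate($img, rand(0, 255), rand(0, 255), rand(0, 255));
    $x = rand(0, 150);
    $y = rand(0, 50);
    imagefilledrectangle($img, $x, $y, $x + rand(5, 20), $y + rand(5, 20), $noise);
}

// Each character gets its own (reasonable) angle, position and color
for ($i = 0; $i < strlen($answer); $i++)
{
    $fg = imagecolorallocate($img, rand(0, 100), rand(0, 100), rand(0, 100));
    imagettftext($img, 22, rand(-20, 20), 15 + $i * 28, rand(35, 50), $fg,
                 'app/fonts/captcha.ttf', $answer[$i]); // Font path is hypothetical
}

header('Content-Type: image/png');
imagepng($img);
imagedestroy($img);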

Local, private, effective captcha

Before ending the post, I'd like to clarify why captchas were added to the project while other utilities get overlooked. Pocket_php is currently powering fourteen web services with more planned, and while working with these individual projects I often get tempted to add a particularly handy snippet to the pocket_php codebase for future use, only to scrap the idea in favor of its original vision: to provide the fastest, simplest template for web projects to use as a foundation. Abiding by this rule means that certain kinds of utilities are often not incorporated into the source because they're either too situational or just not worth the programmer's effort to implement.

What separates captchas from other potential features is how necessary they've become and how widespread the use of (often subversive) third-party captcha services has grown in response, especially among sites that don't really benefit from such an approach at all. In the end, it's not about how easy or fast it is to implement captchas, it's about having the option of privacy within the same reach as the alternatives.
[#95] [12/04/2021][misc]
// Gray morning

We're almost halfway through the year already, in what feels like an instant.



Earlier this month, we agreed to help a video game studio with their upcoming MMO by developing a networking tool that aids the debugging and testing of their server's many features. In my opinion, the coolest request we've had yet.

Now, I've been meaning to write a few tutorials about some of the subjects that, in my experience, give aspiring programmers the most trouble. But 2021 has so far been a very busy year with no signs of slowing down, and since I can't really spare the time to start yet another secondary project, I settled on being more thorough about how we get actual engineering jobs done and documenting the relevant parts of the programming process in a tutorial format. Thankfully, the client agreed to release the finished software for free, making it a great opportunity to test out the idea.

Since pretty much all modern operating systems concern themselves with providing only the absolute minimum through their networking API, it falls to library developers to implement a more robust solution for programs to then use as an intermediary. Features like multithreading, OS API abstraction, data serialization, packet queuing, SSL / TLS encryption and connection lifetime management are some of the more common demands of a typical, modern networking app. As for the sockets themselves, both the UNIX and Windows interfaces are very much the same but with enough minor differences to merit some abstraction, ironically the simplest problem to solve. The others? Not so trivial. Thus, we'll be employing the ASIO networking library as our foundation.

Like its name implies, the library focuses on providing "Asynchronous Input & Output" that can be used in a variety of ways, networking and serial communication being among the more common. Socket centric projects, in particular, don't really have much of a choice but to adopt an asynchronous approach to combat the aforementioned problems; not so easy in the 90s, but nowadays tools like ASIO and modern C++ optimize the process quite nicely. The resulting rise of independent, high performance services, like private MMO servers (of which ASIO has powered many) and crypto trading bots, has helped cement ASIO as arguably the best networking library for C++. So much so that I wouldn't be surprised if it (eventually) gets incorporated into the official C++ standard, not unlike a few other Boost projects.

The main idea is to get around the unpredictability of sockets by calling the relevant functions from a different thread to prevent pausing the rest of the program, and even though ASIO makes this (relatively) easy, there are still a few details to consider.
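
The skeleton of that approach looks roughly like this (a minimal sketch using the standalone ASIO headers; the thread count and shutdown logic are simplified for illustration and are not the tool's actual code):

#include <asio.hpp>
#include <algorithm>
#include <thread>
#include <vector>

int main()
{
    // A single io_context services every asynchronous socket operation in the program
    asio::io_context ioContext;

    // Keeps run() from returning while no asynchronous work has been queued yet
    auto workGuard = asio::make_work_guard(ioContext);

    // One worker per hardware thread; each one executes completion handlers as operations finish
    std::vector<std::thread> workers;
    const unsigned threadCount = std::max(1u, std::thread::hardware_concurrency());
    for (unsigned i = 0; i < threadCount; ++i)
        workers.emplace_back([&ioContext]() { ioContext.run(); });

    // ... queue async_accept / async_read / async_write operations here ...

    // Shutdown: release the guard, stop the context and join the pool
    workGuard.reset();
    ioContext.stop();
    for (auto &worker : workers)
        worker.join();
    return 0;
}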

The client also specified that the program will be run on virtual machines without a GUI, and given the I/O (not to mention A5STH5TIC) limitations of a vanilla console, ncurses was added as a dependency. Even though it's older than me, it has yet to be trumped as the king of console UI libraries, and with good reason: it's very efficient, fast, tested, documented, easy to incorporate and likely already installed in whatever UNIX system the client will be using. However, it has absolutely no multithreading support. Working asynchronously with multiple threads can trigger memory related crashes if an ncurses function is called from anywhere but its designated thread. A critical limitation for a tool that requires multiple threads to work, but by no means an impossible one.

Consolidating the threading discrepancies between ASIO's internals and ncurses is all about discerning how ASIO will (asynchronously) communicate with ncurses, and making sure this only happens in a predictable, thread-safe manner.
This extends to the internal I/O buffer storage and packet queue as well: any thread parsing a packet into the incoming data queue must do so in harmony with the rest of the threads processing the ASIO io_context, which may trigger an operation on the queue while it's already in use. So, for starters, we have to make note of where such events will happen.

The graph pinpoints the critical sections of the executable and their relationships, starting with the beginning of the main thread (top left). As previously stated, all ncurses functions have to be called from the main thread or risk undefined behavior. Luckily, there are only three relevant functions, out of which only one should be accessed by secondary threads: the window output / system logging function. This is the only access to the window's output available, meaning its internal container must be protected to prevent simultaneous writes by the rest of the threads.

Do note that even though the UI drawing function will be invoked solely by the main thread, the container storing the window's output could be modified by a secondary thread as the draw function is processing it. This container has to be protected during UI processing as well, or it could be updated as it's being iterated through. That makes two sections of the main thread that must be accounted for when writing the program. As for the rest of the threads, the relevant action will be happening between a handful of functions that interact with the sockets themselves. Requesting or accepting connections, reading from and writing to the sockets, processing requests and logging through ncurses will all be done asynchronously from any of the secondary threads.

Now, what exactly is to be done to protect these vulnerable data structures? The most common approach is to use locks and mutexes to limit access to the protected resources to one thread at a time. Despite not being the fastest spell in the book, it can still perform at a superb level as long as the locking period of the mutexes remains as short as possible. This is another key reason why performance sensitive netcode should always be planned ahead of time: if we design a multi-threaded program around locking and waiting but fail to account for the workload inside each critical section, we risk bottlenecking the tool's performance with code that is often difficult to change.
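
As a sketch of that pattern applied to the logging container mentioned earlier (names and structure are mine, not the tool's actual source): worker threads only ever push into a locked queue, and the main thread drains it right before redrawing the ncurses window, keeping each lock as short-lived as possible.

#include <deque>
#include <mutex>
#include <string>

// Shared log container: any worker thread may push messages, but only the main
// thread (the one that owns ncurses) drains and renders them
class ThreadSafeLog
{
public:
    // Called from any secondary thread
    void push(const std::string &line)
    {
        std::lock_guard<std::mutex> lock(_mutex); // Held only for the push itself
        _pending.push_back(line);
    }

    // Called from the main thread right before the UI redraw
    std::deque<std::string> drain()
    {
        std::lock_guard<std::mutex> lock(_mutex);
        std::deque<std::string> out;
        out.swap(_pending); // Swap instead of copy to keep the critical section short
        return out;
    }

private:
    std::mutex _mutex;
    std::deque<std::string> _pending;
};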

In our case, the only procedure that risks such behavior is the request resolution function; the rest shouldn't be doing much more than adding or removing a small object (representing a packet) to the connection's internal packet queue, but resolving requests may entail more complex behavior that wouldn't play well with lock based designs. Still, so far we've covered most of the expected pitfalls of working with high-octane netcode in the context of our project.

I'll cover the source code in the next post, this one has grown long enough.

Got a project request from a game developer asking for a tool to thoroughly test custom, TCP based network protocols to help debug a developing MMO server. I was looking for an excuse to write a similar program to spice up the next SkeletonGL game and the client was happy to open source the final product, so I accepted the request despite the difficulty and time constraints. After all, a good chunk of our business involves these kinds of specialized, performance focused solutions. Few are this interesting, though.

Having worked with game servers in the past, I was well aware of how projects involving sockets tend to be either very fast and straightforward or hard-to-debug, convoluted puzzles that demand a solid foundation of software engineering principles from the programmer and an efficient design philosophy to overcome the many pitfalls of bad netcode.

Getting around the limitations of the operating system's socket interface is not as trivial as its simplicity may initially suggest, though much of the challenge lies in the dynamics of networking itself rather than interface shortcomings. A TCP socket, for example, may terminate a connection at any given moment by different means, suffer unpredictable time delays between endpoints in a synchronized system, or slow down to a halt serving a client with a particularly shit connection that bottlenecks the server; maybe someone tries to connect using networking tools (like the one requested) to mess with the service. The list of potential critical scenarios goes on.

With that in mind, here's a quick rundown of what the client asked for:

  • Traffic simulation and analysis tool
  • Support for custom, TCP based networking protocols
  • IPv6 support
  • Each instance must be able to act as a server as well
  • Console based GUI, will be mostly run in a VM terminal
  • Support for as many threads as the CPU allows
  • SSL / TLS 1.2 support

It's apparent that the idea behind the request is to essentially outsource the aforementioned netcode issues into a templated solution that can be easily modified to fit their testing needs.
Since this is one interesting project I'll upload another blog post detailing the design and development process soon enough.

Lost another client to the failing economy this past week. A small phone and laptop repair business that I used to host a few services for was unfortunately claimed by the current economic circus, namely the recent cost spike in rent and basic services and its inability to compete with global brands. At least the owner managed to find employment shortly after, if that can even be considered advantageous anymore.

Workloads are at an all-time high, yet the greatly increased productivity is not reflected in the quality of the employee's life, or even in their overall time on the job. On the contrary, despite the nigh miraculous scientific and technological leaps of the past century, we find ourselves in the ironic position of a deprecated asset on its way out. One that isn't even worth replacing with dignity and whose loss, ultimately, has little to no repercussions for a monopolized and corrupt market.

At this point I can't tell if this purge of independent efforts is part of the price to pay for genuine progress or just another victim of predatory corporations.


Deep within the more inquisitive internet communities, beyond the reach of censorship where the net's original anarchy still reigns supreme, a group of anonymous freedom enthusiasts have been nurturing a very interesting rumor concerning the bitcoin blockchain that has recently gained traction in the mainstream cyberspace. I'm well aware of the countless conspiracies already circulating but unlike most, this one has enough objective merit to warrant suspicion.



Some basic knowledge of how bitcoin operates is required to fully appreciate the gossip. For those unaware of how the BTC blockchain technology works, it's basically a program that maintains a log of transactions across a network; this oversimplification ignores a lot of details critical to the bitcoin environment, but for the context of this blog post what matters is that:

1. The bitcoin blockchain contains the entire BTC transaction history
2. This blockchain is preserved across all participant nodes
3. It is possible to send bitcoins to an "invalid", personalized address


Bitcoins sent to these fake addresses will be lost; nevertheless, they are still a transaction that must be logged, appending the transaction details (including the fake output addresses) to the blockchain and distributing it across the entire mesh. Valid bitcoin addresses are generated from 256-bit private keys: the corresponding public key is hashed down to a 160-bit (20-byte) string that is then stored in the blockchain as hex and can only be redeemed with the original private key.

This also means the blockchain isn't responsible for ensuring that a certain output address can indeed be accessed; it's the user's responsibility to provide an address he holds the private key to. Providing any unique combination of 20 bytes as a destination address is perfectly valid, though it would lock any bitcoins sent to it, since the blockchain requires the private key used to generate the address to redeem said coins. Such is the price for storing 20 bytes of (unique) data in the BTC blockchain.

In a typical bitcoin transaction, decoding the resulting output addresses from hex to ASCII shouldn't amount to anything but a string of seemingly random alphanumeric characters. That is, unless said address was generated by encoding a custom, 20-byte ASCII string into an address using Base58Check, in which case the decode would produce the original ASCII text.
This means that, as long as you don't mind losing a few BTC, you can insert 20 bytes of custom (and unique) data into the blockchain, preserving it for posterity by synchronizing it across all participant nodes as a decentralized, uncensorable, nigh untamperable public database.
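
For illustration, here's a minimal sketch of that decoding step in PHP (requires the gmp extension; error handling is omitted and the address is the one from the screenshot below, so the payload is binary rather than readable text):

// Base58Check layout: 1 version byte + 20 payload bytes + 4 checksum bytes
function base58DecodePayload($address)
{
    $alphabet = '123456789ABCDEFGHJKLMNPQRSTUVWXYZabcdefghijkmnopqrstuvwxyz';
    $num = gmp_init(0);
    for ($i = 0; $i < strlen($address); $i++)
    {
        $pos = strpos($alphabet, $address[$i]);
        if ($pos === false)
            return NULL; // Not a valid base58 character
        $num = gmp_add(gmp_mul($num, 58), $pos);
    }
    $bytes = gmp_export($num); // Big-endian byte string
    // Leading '1' characters encode leading zero bytes
    $leadingOnes = strlen($address) - strlen(ltrim($address, '1'));
    $bytes = str_repeat("\x00", $leadingOnes) . $bytes;
    // Drop the version byte and the 4-byte checksum, keep the 20-byte payload
    return substr($bytes, 1, 20);
}

echo base58DecodePayload('15gHNr4TCKmhHDEG31L2XFNvpnEcnPSQvd'); // Raw 20-byte payload (ASCII if the address hides text)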

Decoding address "15gHNr4TCKmhHDEG31L2XFNvpnEcnPSQvd" reveals the first section of a hidden JPG



Taking the previous rundown into consideration, it's to be expected that some bitcoiners use the blockchain to store and secure small amounts of data in creative ways. A practice so aligned with the blockchain's principles that Satoshi Nakamoto himself famously embedded the following string into the very first bitcoin block, the "genesis block" (though it was added to the coinbase when he mined the first block, not via the previously explained method).

Satoshi Nakamoto's message embedded in the genesis block

It's at least evident that he designed bitcoin to allow for arbitrary data, taking full advantage of the technology and serving both as an alternative to our current FIAT overlords and as a digital wall for the community to graffiti. So, when I learned that "someone" had supposedly hidden some very damning blackmail on very important and influential people in the bitcoin blockchain, I was more than willing to believe it. In fact, Wikileaks had previously done something similar by embedding a 2.5MB zip file full of links to the actual data, though it was done in a rather obtuse way to prevent data scavengers from stumbling into a cache of leaked secrets.

Like this 'lucifer-1.0.tar.gz' file found in TX hash # aaf6773116f0d626b7e66d8191881704b5606ea72612b07905ce34f6c31f0887

Today, there is even an official way to add messages to the blockchain by using the OP_RETURN script opcode. Bitcoin was to be used as a broker of both credit and information from the very beginning.

If you'd like to 'mine' the blockchain for secrets, I wrote a simple script that decodes all the output addresses in a transaction; it queries Blockchain.com so you don't have to download the entire blockchain. As previously explained, this alone may not be enough to peel away all the potential layers of obfuscation, but it is enough to take a quick peek at what the bitcoin community has been sneaking into the blockchain.

Bitcoin logo added as a two part JPG in ceb1a7fb57ef8b75ac59b56dd859d5cb3ab5c31168aa55eb3819cd5ddbd3d806 & 9173744691ac25f3cd94f35d4fc0e0a2b9d1ab17b4fe562acc07660552f95518


ASCII tribute in TX 930a2114cdaa86e1fac46d15c74e81c09eee1d4150ff9d48e76cb0697d8e1d72

Just what could be hiding in the blockchain?
[#90] [01/01/2021][misc]
// A new decade

Argos usually wakes me up before sunrise but it seems that even he felt last night's chill. Fortunately, the first morning of the new decade was a warm one despite being halfway through the damp winter. Can't complain about a comfy morning in these times of hollow hope and gray skies.

[#89] [31/12/2020][misc]
// New Year, new pain

It's been a painful year, one of global misery and shattered hope that will leave a scar on society. Every aspect of our civilization was tested to an extreme by both the virus countermeasures and the hit our already dying economy inevitably took. All of this unfolding as the world's leading nations duke it out in yet another invisible war and the opportunistic mega-corporations slowly tighten their grip on the last remnants of personal freedom left.

The future that awaits, though still shrouded by uncertainty, paints an even worse scenario. One where the current system can no longer uphold all the accumulated negligence and stupidity we've been responsible for and begins to fail when it is needed the most.


SkeletonGL v2.0 has been released. The first stable and feature-complete build of the library, it has slowly but surely grown into exactly what it was envisioned to be and then some. The engine is now apt for more than just hobby projects by merit of its stability, performance and simplicity; it offers enough rendering capabilities to give the established C++ 2D rendering choices like allegro, SFML & SDL2 some competition.

This latest version has seen major changes to almost every section of the source to accommodate some of the more elaborate additions, but the interface has remained practically the same. Updating an SGL project to v2.0 should be almost as easy as pulling from the git and recompiling; however, the engine internals have been overhauled, so if your build relies on custom changes I'd advise taking a look at SGL_DataStructures.hpp, which now contains all the internal OpenGL resource names.


Internally managed OpenGL resources

namespace SGL_OGL_CONSTANTS
{
    // GL_LINES uses this value to set the line's width, note that if AA is enabled it limits the line width
    // support to 1.0f
    const float MAX_LINE_WIDTH = 20.0f;
    const float MIN_LINE_WIDTH = 1.0f;

    const float MAX_PIXEL_SIZE = 20.0f;
    const float MIN_PIXEL_SIZE = 1.0f;

    const float MAX_CIRCLE_WIDTH = 1.0f;
    const float MIN_CIRCLE_WIDTH = 0.01f;

    // These rendering constants are the maximum amount of simultaneous instances to be rendered in a batch
    // and MUST NOT BE EXCEEDED
    const std::uint32_t MAX_SPRITE_BATCH_INSTANCES = 10000;
    const std::uint32_t MAX_PIXEL_BATCH_INSTANCES = 10000;
    const std::uint32_t MAX_LINE_BATCH_INSTANCES = 10000;

    // Names assigned to the OpenGL objects used by the SGL_Renderer
    const std::string SGL_RENDERER_PIXEL_VAO                  = "SGL_Renderer_pixel_VAO";
    const std::string SGL_RENDERER_PIXEL_VBO                  = "SGL_Renderer_pixel_VBO";
    const std::string SGL_RENDERER_PIXEL_BATCH_INSTANCES_VBO  = "SGL_Renderer_pixel_batch_instances_VBO";
    const std::string SGL_RENDERER_PIXEL_BATCH_VAO            = "SGL_Renderer_pixel_batch_VAO";
    const std::string SGL_RENDERER_PIXEL_BATCH_VBO            = "SGL_Renderer_pixel_batch_VBO";
    const std::string SGL_RENDERER_LINE_VAO                   = "SGL_Renderer_line_VAO";
    const std::string SGL_RENDERER_LINE_VBO                   = "SGL_Renderer_line_VBO";
    const std::string SGL_RENDERER_LINE_BATCH_INSTANCES_VBO   = "SGL_Renderer_line_batch_instances_VBO";
    const std::string SGL_RENDERER_LINE_BATCH_VAO             = "SGL_Renderer_line_batch_VAO";
    const std::string SGL_RENDERER_LINE_BATCH_VBO             = "SGL_Renderer_line_batch_VBO";
    const std::string SGL_RENDERER_SPRITE_VAO                 = "SGL_Renderer_sprite_VAO";
    const std::string SGL_RENDERER_SPRITE_VBO                 = "SGL_Renderer_sprite_VBO";
    const std::string SGL_RENDERER_SPRITE_BATCH_INSTANCES_VBO = "SGL_Renderer_sprite_batch_instances_VBO";
    const std::string SGL_RENDERER_SPRITE_BATCH_VAO           = "SGL_Renderer_sprite_batch_VAO";
    const std::string SGL_RENDERER_SPRITE_BATCH_VBO           = "SGL_Renderer_sprite_batch_VBO";
    const std::string SGL_RENDERER_TEXT_VAO                   = "SGL_Renderer_text_VAO";
    const std::string SGL_RENDERER_TEXT_VBO                   = "SGL_Renderer_text_VBO";
    const std::string SGL_RENDERER_TEXTURE_UV_VBO             = "SGL_Renderer_texture_uv_VBO";

    // POST PROCESSOR EXCLUSIVE
    const std::string SGL_POSTPROCESSOR_PRIMARY_FBO    = "SGL_PostProcessor_primary_FBO";
    const std::string SGL_POSTPROCESSOR_SECONDARY_FBO  = "SGL_PostProcessor_secondary_FBO";
    const std::string SGL_POSTPROCESSOR_TEXTURE_UV_VBO = "SGL_PostProcessor_UV_VBO";
    const std::string SGL_POSTPROCESSOR_VAO            = "SGL_PostProcessor_VAO";
    const std::string SGL_POSTPROCESSOR_VBO            = "SGL_PostProcessor_VBO";

    // Default shader uniform names, make sure they match your custom shaders.
    const std::string SHADER_UNIFORM_V4F_COLOR                  = "color";
    const std::string SHADER_UNIFORM_F_DELTA_TIME               = "deltaTime";
    const std::string SHADER_UNIFORM_F_TIME_ELAPSED             = "timeElapsed";
    const std::string SHADER_UNIFORM_V2F_WINDOW_DIMENSIONS      = "windowDimensions";
    const std::string SHADER_UNIFORM_M4F_MODEL                  = "model";
    const std::string SHADER_UNIFORM_M4F_PROJECTION             = "projection";
    const std::string SHADER_UNIFORM_F_CIRCLE_BORDER_WIDTH      = "circleBorder";

    // POST PROCESSOR EXCLUSIVE
    const std::string SHADER_UNIFORM_I_SCENE                    = "scene";
    const std::string SHADER_UNIFORM_V2F_FBO_TEXTURE_DIMENSIONS = "fboTextureDimensions";
    const std::string SHADER_UNIFORM_V2F_MOUSE_POSITION         = "mousePosition";

};



Moving on to new features, the SGL_Renderer can now render GPU accelerated circles. Drawing circles in modern OpenGL is rather complicated since there is no native OpenGL function to do so; they must be manually computed and rendered using the available primitive types, which can be unnecessarily costly. After some experimentation, it became apparent that the fastest way to render circles is to form a square by joining two mirrored triangles and use the surface as a canvas to render the circle on with a special fragment shader. Basically, circles are just sprites using a shader that renders a circle on top.

To complement the other primitive renderers, however, the new SGL_Circle object is a straightforward representation of a circle and can be rendered by calling the renderCircle function, abstracting away the internal SGL_Sprite. Conversely, it's possible to draw a circle on top of a sprite by specifying the SGL_Sprite shader as a circle shader; the circle's width is passed as renderDetails.circleBorder.

struct SGL_Circle
{
    glm::vec2 position;                          ///< Circle position
    SGL_Color color;                             ///< Circle color
    SGL_Shader shader;                           ///< Shader to process the circle (because why the fuck not)
    float radius;                                ///< Circle size
    BLENDING_TYPE blending;                      ///< Blending type
};

// Added to SGL_Renderer
void renderCircle(float x, float y, float radius, float width, SGL_Color color);
void renderCircle(const SGL_Circle &circle) const; // Circles are just invisible sprites used as canvas



Rendering a batch of circles is as easy as calling renderSpriteBatch() with an SGL_Sprite that has been assigned either a custom circle shader or the included SGL::DEFAULT_CIRCLE_BATCH_SHADER (which is in reality a modified SGL::DEFAULT_SPRITE_BATCH_SHADER).

Circle batch example

    SGL_Sprite avatar;
    avatar.texture = _upWindowManager->assetManager->getTexture("avatar");
    avatar.shader = _upWindowManager->assetManager->getShader(SGL::DEFAULT_CIRCLE_BATCH_SHADER);
    avatar.shader.renderDetails.timeElapsed = _fts._timeElapsed/1000;
    avatar.shader.renderDetails.circleBorder = 0.06;
    avatar.position.x = 40;
    avatar.position.y = 10;
    avatar.size.x = 32;
    avatar.size.y = 32;
    avatar.color = SGL_Color(1.0f,1.0f,1.0f,1.0f);
    avatar.blending = BLENDING_TYPE::DEFAULT_RENDERING;
    avatar.resetUVCoords();

    // Note the mismatch in shader and render call, in this case it will default to rendering a sprite with
    // default settings
    _upWindowManager->renderer->renderSprite(avatar);

    // Generate 4000 sprites worth of model data
    std::vector<glm::mat4> sBatch;
    for (int i = 0; i < 4000; ++i)
    {
        // Prepare transformations
        glm::mat4 model(1.0f);
        float r2 = static_cast<float>(rand()) / (static_cast<float>(RAND_MAX / 3.12f));

        model = glm::translate(model, glm::vec3(rand() % 320, rand() % 180, 0.0f)); // move
        // rotate
        model = glm::translate(model, glm::vec3(avatar.rotationOrigin.x, avatar.rotationOrigin.y, 0.0f));
        model = glm::rotate(model, r2, glm::vec3(0.0f, 0.0f, 1.0f));
        model = glm::translate(model, glm::vec3(-avatar.rotationOrigin.x, -avatar.rotationOrigin.y, 0.0f));
        // scale
        model = glm::scale(model, glm::vec3(avatar.size, 1.0f)); // scale
        sBatch.push_back(std::move(model));
    }

    // In this case the sprite batch renderer matches with the CIRCLE_BATCH_SHADER and will draw a circle
    // on top of the sprite each instance render
    _upWindowManager->renderer->renderSpriteBatch(avatar, &sBatch);


GPU accelerated primitives (including circles!)

Pixel, Line, Circle & Sprite batch rendering test

This flexibility allows circles to be rendered on top of any sprite and for the circle's border width to be specified, all in a single draw call.

Behind the scenes, circles are just custom shaders and can be applied to any sprite


It's also possible to fill the circle by simply specifying a bigger border width value.

Same circle, different border widths


To better showcase what the library is capable of, the original plan was to release the v2.0 update alongside an arcade game called Risk Vector. However, time constraints and the 2020 global fuckery in general left me with little time to develop the game, so I opted instead to polish SkeletonGL as much as possible, then upgrade the software already using it, and only then move on to developing some games.

The few client projects using SGL as a means to render graphics have already been updated (check your email for notifications) and both CAS-SGL and Snake-SGL will hopefully soon follow.

[#87] [19/11/2020][misc]
// Exposed to the Sun

The weather is slowly but notably shifting into the lower 20Cs after yet another year of arid hellscapes.

If only there were more natural obstacles between the sun and the streets...


This second half of the year feels twice as long as the first. There are so many bizarre situations simultaneously unfolding it makes the past decade look insignificant in comparison.
So much has happened in the past ten months alone that the precedent for the foreseeable future is clearly an even bigger struggle. As every corner of the planet is busy dealing with its own mayhem, we as individuals are being economically, emotionally and intellectually taxed to extremes previously reserved for the horrors of total war. Victims to a silent yet meticulously planned offensive that keeps us artificially preoccupied and ultimately distracted from what truly matters.

The collective stress is palpable even from a distance. Expressions full of exhaustion are now the norm even for those few that had remained unscathed thus far, now resigned to accept their newfound status as yet another prisoner to the system's jailer with the rest of their worth. Such is the fate of all members of a society that still requires human intervention to perform at essentially every level. Perhaps when this is no longer the case, we won't even be necessary anymore.

Carl Jung proposed a rather novel theory to explain the mystery behind the seemingly autonomous behavior of complex lifeforms, where all members of the same species share a cognitive link among their own kin through a metaphysical entity he termed the "collective unconscious". A universal and impersonal element of our being that is inherited and beyond our control, acting as an abstract mediator between our ancestors' experiences and our immediate psyche, a repository of our predecessors' ideas that manifests itself through cryptic means, like creative thoughts and dreams.

Needless to say, such a model lies beyond the scope of our current understanding of the nature of reality and well within the realm of the metaphysical; it even shares many similarities with Plato's "Realm of Ideas", a similar concept that frames our perceived reality as a projection of a higher dimensional plane from which all ideas originate. The resulting universe is, therefore, an imperfect projection of a perfect idea, host to incomplete shadows or "forms" that are but abstractions of a core, fundamental concept. Thus, complex life can be understood as an incomplete permutation of a greater, encompassing idea that keeps an unconscious, bidirectional connection with its own projection.


Who we are lies betwixt our cognitive genetic legacy and the arbiter of our own original actions we define as free will. The metaphorical blank check that new consciousnesses are popularly compared to is, therefore, inaccurate in that it fails to account for the inherited psychological defaults, the archetype that from that point on, will wrestle with the conscious self into a predestined path of sorts. A Hegelian dialectic of the experiences we assimilate as unique individuals and the accumulated lives of our species.

SkeletonGL was recently used as a starting point for a client's data visualization software (closed source at their request, unfortunately) that ended up serving as a catalyst for a few changes.

The biggest ones being:

■ Polished the instance renderers, they should work much faster now
■ Added a limit of 10,000 items per call, see SGL_DataStructures.hpp to change it
■ Added blending options to the SGL_Pixel, SGL_Line and their respective batch renderers
■ Added new blending functions
■ All instance renderers are now compatible with their base shaders (i.e. sprite shaders work with sprite batching)
■ The getTextureMemoryGPU function now includes the instance renderers' pre-allocated memory
■ Added a makefile / compile time macro to add or remove error checking after render calls
■ SGL_Pixel can now specify pixel size
■ SGL_Line can now specify line width
■ Optimized away many unnecessary calls to OpenGL VBOs


Testing the instance renderers (FPS lowered due to compression)
The library is shaping up very nicely; the performance boosts alone allow for some truly crazy shit. This will most likely be the final release before 2.0 and Risk Vector are done.
[#83] [23/08/2020][misc]
// Sunday mornings

The yearly heat wave, though unforgiving as ever, was at least two or three degrees lower than the previous one.

Some believe this precedes a particularly cold winter; I hope they're right.

Statistics show that depression has crept its way into every corner of the world. No nation, developed or otherwise, has managed to counter its now global presence. A side effect of the relentless barrage of psychological warfare we are subject to, slowly altering our behavior to best adapt to the morphing culture and its unforgiving stance on those that refuse to change.

This is happening to many others that find themselves in similar or worse positions, a distinction that is quickly blurring into an homogeneous rank for anyone and everyone below the upper echelons of the socioeconomic ladder. Where the frustrated masses find themselves piled up in the same, miserable category as expendable peons in the midst of a metaphysical conflict beyond their collective comprehension. For the techniques employed to steer minds into a societal union no longer rely on honorable actions or even the promise of wealth and pleasure. These new, scientifically backed methods operate on a philosophy too complex and unintuitive to grasp by most. Such is the level of refinement and efficacy that the results can be nigh unfathomable and well within what some would describe as (evil)magic.

Perhaps this has always been the case; human history is proof of the sacrifices required for change, as well as of the fact that, despite the times, the pattern remains true. Civilization, being as complex as it is, cannot rearrange itself without temporarily losing its momentum, lest it destroy itself attempting to steer. In the same way a car must slow down to an appropriate speed to safely make a pronounced turn, so must the human collective take the necessary precautions to survive changing its direction. As of the last twenty years, it has become clear that we aren't being led to where we were promised; even worse, it may already be far too late to compensate for the incoming curve.

Pocket_php's session manager has been updated to handle both inactivity timeouts and login expiration independently from the php-fpm daemon configuration; this enables projects with different session configurations to run simultaneously on the same web server. The demo website was also updated. Updating to 1.3.

The new pocket_php demo site

Being a firsthand witness to just how fast nature attempts to reclaim its former land has so far been the clearest sign of how pronounced the halt in human activity really is.

Flocks of birds became a daily sight, with places like roofed parking lots and public parks hosting so many nests it became quite common to spot egg shells on the sidewalk. By the end of the second month, cats and possums had bred into armies fighting for territory, with the casualties attracting scavenger species that knew better than to get close to the city, especially deep enough into the human hive for me to see them. Native insects that were believed to be long gone (like mantises and tarantulas) could be found in the denser patches of grass. Lizards decimated the ever-present roaches, rats countered this by feeding on the surplus of lizards, which in turn allowed the bigger predators to thrive on the already problematic amount of rats. At its apex, leaving trash bags out in the open meant they were going to be undoubtedly vandalized by the swarms of critters that by then had already adopted a basic hierarchy and thus focused their attention on looting our precious garbage.



Cats in particular benefited most from the bizarre situation; with birds and smaller critters growing in number and an almost total lack of human intervention, they opportunistically multiplied. The resulting generation, having grown and adapted in an environment of abundance that required relatively little effort to survive and having learned from their parents' warped example, failed to inherit the knowledge a street cat requires to survive, setting them free from the intellectual and memetic bonds that (for better or worse) enabled their predecessors to make it in the concrete jungle. Including their distrust for humans.

Funny enough, as economic activity rose to "normal" levels and the inevitable culling began, the newest generation that was born into staleness and luxury was the first to die.

What happens when an individual arrives at the conclusion (correctly or not) that the current state of affairs that directly affects his life is beyond the influence of even the most realistically capable version of himself? We have long outgrown the memetic values of old that, up until a few decades ago, served as the foundations of our (actual) social network; they made up the collectively accepted yet largely unmentioned moral contract that encouraged strangers within the same living space to bond despite their relative lack of immediate kinship. This, of course, was only possible thanks to our cultural achievements that have elevated even the most unfortunate of souls from the dirt of nature and into the uncertain freedom of their own will.

This shift also warped the dynamics of our daily lives to the extremes we know and love today. Self-destructive vanities like sedentary lifestyles and stupid, counterproductive diets were once reserved for only the most privileged prior to the industrial revolution. And such people were looked at as what they were: opportunistic hedonists that were only pampered by virtue of their birth or coin, abusing their often unearned positions to perpetuate the degenerative pleasures that gave purpose to their otherwise seemingly pointless existence. As motivated to pursue further distractions as they may have been, the fact of the matter is that the technological disparity between the aforementioned characters and the modern man is so massive that they can't possibly be equally categorized.

Even history's most infamously selfish and misguided individuals, with all their power and wealth, could only muster so much indecency before the limits imposed by their technology and manpower forced their egos into a plateau. Today, this metaphorical obstacle is no more, for it's (far) easier to indulge in self abuse in the age of scientific miracles and normalized misery. Who, after all, is willing to push back against what is, for most, the only escape from their respective dystopian prison? The immediate dopamine rush inherent to modern distractions greatly outweighs the slow yet wholesome rewards of discipline, a feat only possible by abusing our newfound technological powers. The resulting artificial environment can be so particularly oppressive because its countless faults are being compensated for by computers and robots from every angle. Being nothing short of ideal slaves, tireless and eager to gain purpose by fulfilling the activities that kept our ancestors busy and relevant, they directly promote the societal role of the average man into that of a conqueror of nature who answers only to his own judgement.



PHP's native cookie management is already as simple as it can possibly be; wrapping its minimalism in a merely rebranded interface would only bloat the codebase. Though I have yet to encounter the need to modify said cookies' policy, encapsulating the cookies themselves away from the PHP global scope fits the overall design better, so the HTTPRequest class now has a container for all of the client's cookies.

Updated to 1.2.

Quarantined at home, I've been using the extra time to work on the next SGL example project and the SGL engine itself. The game will cover a lot of ground for future projects as well as provide a more thorough showcase of SGL's capabilities and features, including ones not present in the previous SGL game like:

■ Game state management & responsive UI
■ Sprite batching & texture atlas
■ Custom shaders and post processing effects
■ Both TTF and bitmap font rendering
■ Updates to the real time debugging panel & production build settings
■ Support for both VSYNC & unlocked framerates


Fortunately enough, the amount of internal changes needed to accommodate most updates has been minimal and will most likely stay this way until version 2.0 releases with the finished game. For now, all dependencies have been updated to their latest standards and tested to ensure compatibility; their respective credits and licenses are listed in deps/deps_credits.txt.

Note that if you're using GLM for your own project's math (and you probably should), starting with version 0.9.9.0 matrices and vectors are no longer initialized by default, meaning that every glm::vec and glm::mat must now be manually initialized (e.g. matrices to 1.0f for the identity) to imitate the previous behavior.
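
A minimal before/after sketch, assuming GLM 0.9.9.0 or newer:

#include <glm/glm.hpp>

// glm::mat4 model;         // pre 0.9.9.0 this was the identity; now its contents are undefined
glm::mat4 model(1.0f);      // explicitly construct the identity matrix
glm::vec3 position(0.0f);   // vectors also need explicit values now
glm::vec4 color(1.0f);      // all components set to 1.0f

// Alternatively, defining GLM_FORCE_CTOR_INIT before including GLM should restore the old
// default-initialization behavior.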

Updated to 1.4.

The pointless quarantine keeps claiming jobs; today I learned of two more layoffs among my contacts before midday. Just a few days ago a client of mine, who accounted for almost half of my current income, had to close up shop as well, halting the development of their project for the foreseeable future. Having taken yet another financial hit, I had to reassess many plans to compensate for the now critical budget, hence the repeated delays for steps that would otherwise be long behind. I can only imagine what those less fortunate and with heavier responsibilities are going through; it's been long enough for the average home to have burned through their savings, assuming they had any to begin with.

Not a single aspect of what used to be our daily life remains intact, the already strained bonds among the few fortunate to have them are being warped into the same, meaningless social contracts that so many of our most basic cognitive needs have been demoted into. Everything beyond our economy has been taken for granted to the absolute extreme, nurturing the underlying misery that invariably accompanies sentience. Explaining why the ranks of the indifferent keep growing.

We are living in an age of technological prosperity that, up until two generations ago, was nothing but the maddest of fiction. What could only be described as literal magic is today a mere recipe; we've triumphed beyond the hardships of our planet and into the lethal void, we've mastered the secrets of life to serve our endless demands. Should we survive a few more generations, I have no doubt we will be capable of feats beyond our current imagination. All this, paid for with our humanity.

Perhaps it wasn't worth it.



SkeletonGL is currently being developed alongside a more elaborate showcase of its capabilities and features than its previous example program, Snake-SGL. The idea is to balance the relationship between the underlying library and the user's application so that SGL provides only what's essential for a high performance rendering engine without forcing any particular design upon the programmer. Features like FPS control and networking thus lie outside the scope of what SkeletonGL was made to do: render shit to the screen. To accomplish this in a predictable manner, however, the system needs to manage the logic's internal refresh rate.

This can be a far more complex problem to solve than it may initially appear. Many of the solutions that are often tried first fail due to subtle issues like floating point inaccuracy when working out the frame's delta time or the inherent unreliability of sleeping a thread for longer than a system-specific amount of time (~1ms). But the real headaches are the variables outside the programmer's control.

What happens if the user opens up 40 Chrome tabs while simultaneously watching a movie and playing your game, creating a resource consumption spike that lags the reactivation of the game's sleeping thread? What if the user has a monitor with a refresh rate different than that of your application and activates VSYNC (no, you can't stop them)? What if the user has a machine so powerful it processes the game at 1000Hz and his monitor at 255Hz, far higher than your game's update loop of 60Hz? What if the host OS is running under some sort of compositor and messing with the screen's refresh rate? What if the user is running the game on a toaster and can't even reach let alone maintain the minimal required FPS? What if despite this the user still activates VSYNC?

I'll write a proper explanation and tutorial in a separate entry; suffice it to say, SkeletonGL now only accounts for its own delta time and no longer offers FPS control settings. Time management has been moved from the SGL library to the application's source. Updated to ver 1.3.
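
For reference, a minimal sketch of what this looks like on the application side: a fixed-timestep (accumulator) loop that keeps the logic rate constant while rendering as fast as the machine or VSYNC allows. The loop and names below are illustrative assumptions, not SkeletonGL API:

#include <chrono>

int main()
{
    using clock = std::chrono::steady_clock;
    const std::chrono::duration<double> logicStep(1.0 / 60.0); // logic runs at a fixed 60Hz

    auto previous = clock::now();
    std::chrono::duration<double> accumulator(0.0);

    bool running = true;
    while (running) // event handling / exit condition omitted for brevity
    {
        const auto now = clock::now();
        accumulator += now - previous;
        previous = now;

        // Consume the accumulated frame time in fixed slices so the simulation
        // stays deterministic no matter how the render rate fluctuates
        while (accumulator >= logicStep)
        {
            // update(logicStep.count()); // game logic always receives the same delta
            accumulator -= logicStep;
        }

        // render(); // draw as often as possible; interpolation can smooth the leftover time
    }
    return 0;
}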

The changes instilled by the COVID-19 panic have reached heights previously reserved for fiction. Homelessness is sweeping through the nation, leaving nothing but misery in its wake and serving as a grim reminder, for the few that remain active in the workplace, of the overwhelming chaos trailing ever closer. To those who were already doomed to sleeping in the streets, however, the potential dangers of the virus haven't been as destructive as the loss of empathy from the rest of the population towards their panhandling ways.

With many lying somewhere between the loss of their income and sleeping on the sidewalk, the desperation among those at the very bottom of the socioeconomic spectrum has pushed them deeper into survival mode, where the imperative to live overcomes any preexisting moral alignment. The air reeks of imminent violence; the fear in the eyes of the displaced can't be read as either an understandable reaction to an unfortunate situation or an opportunistic glance for some much needed relief at a random stranger's expense. Though the quarantine has helped curb such crimes, it failed to address any and all preexisting societal problems, most of which stem from the fact that an increasing percentage of the citizenry have nothing left to lose but their dreaded lives.

In a society as fractured as ours, the gravest danger remains society itself. Even as the (supposedly) ultimate killer thins the herd into patches, even as the economy implodes and food shortages worsen, it was the generations of negligence and indifference towards our own system that ultimately killed hope.