Widget rendering

In the last post I wrote down a first draft of the specification for widgets. All nice and fine, but not really pretty. Pretty ugly actually, so there’s tons of room for improvement. And there was not much on rendering. So this week I decided to work on … *drum roll* widget rendering! So here’s how it looks now:


… and the extra config bits are the “MainRenderer” and “MarginRenderer”


So what’s different? First, instead of a constant color, we now specify a renderer for the margins and a renderer for the main body. These renderers can be specialized for constant colors, textures, animated textures, and whatever else we want. In the examples above, the margin renderer uses a closed door tile from a tilemap for the four margin corners, and a swamp dragon (from the DCSS tilemap) tiled along the sides, rotated properly (my definition of properly, that is). I’ve intentionally set the margins to different sizes, to demonstrate that the textures will scale in order to fit properly with minimal distortion.


The texture rects don’t have to be square, and that’s demonstrated with the other margins, where I use this texture “atlas” composed of two elements: a square corner (highlighted above) and a rectangular “side” tile. Again, the renderer automatically calculates how many times it can fit the texture rect on a margin side with minimal distortion, and maps it appropriately.
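The “fit with minimal distortion” calculation isn’t spelled out in the post, but the idea can be sketched like this (my own guess at the logic in Python, not the engine’s actual code):

```python
def tile_fit(side_length: float, tile_length: float) -> tuple[int, float]:
    """How many times a tile repeats along a margin side, plus the
    per-tile scale factor, chosen to minimize distortion."""
    # Ideal (fractional) repeat count, snapped to the nearest integer.
    n = max(1, round(side_length / tile_length))
    # Each tile is stretched/squashed slightly so n of them fit exactly.
    scale = side_length / (n * tile_length)
    return n, scale
```

For a 100px side and a 32px tile this yields 3 repeats at a scale of ~1.04, i.e. each tile is stretched by about 4% instead of one tile being badly truncated.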

The renderers are completely optional, of course: if no margin renderer is specified, no margins are rendered. Also, if the left margin is 0, no margin will be rendered on the left (but the rest will).

Resource loading

To make the above happen, I also had to implement a general resource database for the application, currently storing textures, texture atlases and shader effects. The texture atlases are .json files that reference a texture and store a mapping of (name, rectangle). For example, the RGB-lines atlas file is the following:
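The actual atlas file isn’t reproduced here, but based on the description (a texture reference plus a name → rectangle mapping), such a file might look like this; the file name, rect names and coordinates below are invented for illustration:

```json
{
  "texture": "textures/rgb_lines.png",
  "rects": {
    "red_line":   [0, 0, 32, 8],
    "green_line": [0, 8, 32, 8],
    "blue_line":  [0, 16, 32, 8]
  }
}
```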


Shader Rects and useless (for now) optimisations

The game is grid-based. The grids will never be humongous. Therefore, I decided early on that 16 bits are enough to represent a grid cell in the overworld or the dungeon. Hell, even a 16-bit signed integer would do (max 32767), and on the plus side, negative values allow me to quickly check if a coordinate is invalid. So, I’m using a pair of 16-bit integers as tile coordinates for most grid operations in the engine, with pretty much zero chance of overflow.

Now, with widget rendering, I realized I’ll frequently need to pass rotated rectangles to the shader. The easy way: make a 2D rotation matrix and pass it to the shader; problem solved. But hey, that’s 4 32-bit floats (matrix) in addition to 4 16-bit integers (rect). That’s a bit overkill, no? Well, it is, as suddenly the data footprint grew by 200%, for typically 3 possible rotations: 90, 180 and 270 degrees. Is there a way to express it better? Most likely.

I know I have a sign bit at the end of each of the 4 16-bit integers of a rect, and these are pretty much unused (WARNING: there might be a scenario where a rect “enters” the viewport from negative coordinates, so I need to keep an eye on that). So, that’s 4 bits to pack stuff in! 2 bits can be used to represent the 0/90/180/270 rotations, and 2 bits can be used to represent flips (horizontal/vertical). That’s it! I actually implemented it, so that’s what I’m going to be using for the vast majority of tile rendering, as well as GUI margins.
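A sketch of that packing scheme (my own illustration in Python, not the engine code), assuming the coordinates stay non-negative 15-bit values as per the warning above:

```python
def pack_rect(x, y, w, h, rot, flip_h, flip_v):
    """Pack a rect plus orientation into four 16-bit values.
    rot is 0..3 (multiples of 90 degrees); the four 'sign' bits
    (bit 15 of each component) carry rot and the two flip flags."""
    assert 0 <= rot < 4
    flags = rot | (flip_h << 2) | (flip_v << 3)  # 4 flag bits total
    vals = [x, y, w, h]
    for i in range(4):
        assert 0 <= vals[i] < 0x8000          # low 15 bits only
        vals[i] |= ((flags >> i) & 1) << 15   # stash one flag bit each
    return vals

def unpack_rect(vals):
    """Recover the rect and the rotation/flip flags from the sign bits."""
    flags = sum(((v >> 15) & 1) << i for i, v in enumerate(vals))
    rect = [v & 0x7FFF for v in vals]
    return rect, flags & 3, (flags >> 2) & 1, (flags >> 3) & 1
```

Same 8-byte footprint as a plain rect, and the shader only needs a tiny decode step instead of a full 2D rotation matrix.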

Next to tackle is input handling for widgets, and perhaps, if there’s time, a tilegrid widget, to view for example a few tilesets with a GUI.

Widget specification – Part 1

Originally, as a next step in (documented) development, I wanted to deal with the various application states (main menu, overworld-idle, targeting, etc.) but I realized that the application states, in my mind at least, have a very strong link with … widgets! Textboxes, tile grids, containers, minimaps, that sort of thing. Therefore, I thought it would be prudent to nail that down first, as it’s a lower-level system. Here is a first stab at a specification, with a working example.

Widgets are objects responsible for rendering and input handling.  Widget examples:

  • Container1D: Container of other widgets, laid out horizontally or vertically (listed top to bottom).
  • Container2D: Container of other widgets, laid out in a grid.
  • TextBox: A rectangular area of text. Used for labels.
  • ListBox: The typical vertical arrangement of e.g. 1. “Blah” 2. “Bloop”, of which the user selects one.
  • Button: A button with optional text. Can be implemented as a special textbox.
  • TileGrid: A view into a 2D grid of cells that are highlightable and selectable.
  • PixelGrid: For minimaps and zoomed out overworld something something


Here is a current example. I have a vertical Container1D with 4 elements: a textbox (with some sort of rich text format), a simple box (it’s a dummy single color widget), a horizontal Container1D with three simple boxes inside, and a simple box with margins (left 1px, bot 2px, right 3px, top 4px). All that is driven via the JSON text below.




  • Widgets are highly hierarchical: A container can contain containers and so on.
  • A widget can be modal or not.
    • In-focus modal widgets don’t allow any other widget to get focus/input, modal or otherwise
    • Each widget can spawn a maximum of one modal widget
    • Modal, in-focus widgets are rendered last. So, for example, we can darken everything else a bit before overlaying the widget, to communicate the modality.
  • Widgets can spawn an arbitrary number of floating widgets (floaters). This for example could be damage numbers above creatures when hit.
    • Non-modal floaters don’t handle input and are not considered for focus
    • Use-case: Damage number (non-modal)
      • when activated, we specify the tile location that it originates from
      • when updating, reduce transparency and increase .y slightly
      • no input handling
  • Container widgets never overlap each other. Only floaters can overlap


  • Widgets are not resizable. Too much hassle to make sure it always looks nice. Presets will have to be created for each supported resolution, or at least for each aspect ratio.
  • Widgets need to provide intersection testing against the mouse pointer. Simplest way: a bounding rectangle.
  • Widgets control their dimensions
  • Widgets do not control their location: their parents do!
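The bounding-rectangle hit test is about as simple as it gets; something like this illustrative sketch, using pixel coordinates with a bottom-left origin as in the rendering model:

```python
def hit_test(rect, px, py):
    """Point-vs-bounding-rect intersection test for the mouse pointer.
    rect = (x, y, w, h) in pixels; (x, y) is the bottom-left corner."""
    x, y, w, h = rect
    return x <= px < x + w and y <= py < y + h
```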

Input model

  • How does the GUI system get input? The application passes the input events to the active root widget. The widget handles input and passes the events to children in case the input was not handled.
  • Input handling functions for: mouse wheel, mouse pointer, mouse buttons and keyboard. Later on could also add controller.
  • Widgets store a vector of (key, command) pairs for handling key presses. When populating that vector, always assert that a key is only used once.
  • Some widgets have constant bindings. For example, some modal dialogs bind Esc for back/cancel, while listboxes bind 1, 2, 3, etc. for option index. Similarly, arrow keys are used for listbox option navigation and tilegrid tile selection
  • Order of handling input:
    • The modal+in-focus widget, if one exists.
    • Otherwise: root widget and non-floater children, recursively.
      • Effectively, depth-first handling.
  • Widget input interaction with the current running state (decoupled)
    • Example: TileGrid widget, SelectTile state
      • widget::OnMouseMotion: emits event TileHighlighted
      • widget::OnMouseButton:
        • LMB: emits event TileActionMain
        • RMB: emits event TileActionSecondary
      • widget::OnKeyboard:
        • Arrow keys emit TileHighlighted.
        • Enter emits TileActionMain
      • The state listens to the events and processes them accordingly
      • The state adjusts the TileGrid widget rendering configuration so that it renders a flashing transparent dark tile over the areas that can be selected (e.g. within attack range)

Rendering model

  • Use pixel coordinates, not a normalized (0,1) space
  • Bottom-left is (0,0)
  • Use a selection of (flyweight) renderers: colored rect, textured rect, font, etc
    • Widgets own the configuration for the renderers, not the renderers themselves
  • Basic widget rendering involves:
    • Rendering its own area using simple color, simple texture, or a fancy shader
    • Rendering its margins (if any) using simple color, simple texture, or a fancy shader
  • The widget hierarchy has an effect on rendering order.
  • Simple rendering method: render everything on the fly, every frame
  • Advanced rendering method (later on): Instead of rendering, just schedule parameterized rendering commands, and let the renderer do the batching/ordering.
    • z depth would be related to the hierarchy level: Children are always rendered in front of parents
  • The hierarchical widget model can be used for caching rendering results.  (later!)
    • A sidebar full of textboxes, lifebars and whatnot will only be re-rendered when one of its elements changes. So, no font rendering all the time
    • TileGrid will be rendered all the time
  • Separate rendertargets may be employed for rendering some widgets.
    • Floating damage textboxes! they are pre-rendered in their own rendertarget, and every frame just blit some rects onto the appropriate location on the tilegrid
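The “schedule parameterized commands and let the renderer order them” idea might be sketched like this (hypothetical names, not engine code), with z tied to the hierarchy level:

```python
def collect_commands(name, children, depth=0):
    """Emit (z, draw-command) pairs for a widget subtree; z equals the
    hierarchy depth, so children always sort in front of their parents."""
    cmds = [(depth, f"draw {name}")]
    for child_name, grandchildren in children:
        cmds += collect_commands(child_name, grandchildren, depth + 1)
    return sorted(cmds)  # the renderer batches/orders by z before drawing
```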


  • Have presets for each resolution/aspect ratio
  • Widgets can be initialized but not activated.
  • Only activated widgets are used in the game
  • Maintain a globally accessible mapping of widget names to widgets. When a state needs to adjust a widget, use the mapping to get access to it (widgets are not owned by states, but they are used by them)
  • Inactive widgets can be used as prototypes: When I need to create a widget, lookup an already created one (using a certain configuration) and clone it.


… And that’s it for now. Next time, I’ll have some more widget types implemented (as listed above), with some fancier rendering and input handling. The first test case will be a main menu and the overworld screen, so when it’s time for the application states, I can connect these two.


I finally finished the implementation of HPA*.

The pathfinder can be parameterized on cluster size, cluster-local path quality, coarse path quality and minimum distance between entrances, to name a few. Below are a few examples. HPA* is ideal for handling many unit pathfinding queries, which I intend to have in AoT.

Clusters and entrances. Background is the movecost heatmap (blue: low movecost, red: high movecost)


Clusters, entrances and connections


Low quality path using HPA* (cost: 70.464)


High quality path using HPA* (cost: 25.349)


High quality path using regular A* (cost: 23.765)



Dynamic path

The pathfinder also works with dynamic weights: when some of the weights are updated, we only recalculate the cluster-local paths in the clusters that contain those weights, i.e. minimizing A* calculations.
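The cluster-dirtying step boils down to mapping each changed cell to its containing cluster; for 8×8 clusters like the ones visualized below, a sketch (my illustration, not the actual implementation):

```python
CLUSTER_SIZE = 8  # the visualization below uses 8x8 clusters

def dirty_clusters(changed_cells, size=CLUSTER_SIZE):
    """Map cells with updated move costs to the set of clusters whose
    cluster-local paths must be recomputed; all other clusters keep
    their cached paths, so no A* runs there."""
    return {(x // size, y // size) for x, y in changed_cells}
```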

Here’s a busy visualization of a dynamic path:


The movecost map is again visualized as a heatmap. Green is the “unknown movecost”, i.e. the so-far invisible area. The map is split in 8×8 clusters. All entrances and connected paths are visualized in magenta. The fat white line is the evolving path calculation. It is clear that “highways” form naturally as we explore the movecost map, as several paths coalesce.