let it leave me like a long breath

let it dissipate or fade in the background

Entries tagged with gamedev challenge




  • Jun. 23rd, 2021
    Tags:
    • gamedev challenge
    posted @ 11:01 am

    so i haven't really been posting here lately. between covid and the move across the US (and the housing situation before that that led to the move across the US in the middle of covid) any schedule i ever had for working on things kind of went out the window.

    stuff i've done recently...

    well, somebody in one of the coding discords i'm in asked about using rhombic dodecahedral cells to make a minecraft-style game, and as one of the handful of people in the entire world who's used the rhombic dodecahedral honeycomb (apparently) i gave them some pointers and then decided to write up a bunch of information about the data structures i use. i'll probably update the site mainpage to include that under 'math writeups' at some time.

    a really long time ago, when i was first starting out using twine, i put together a weird farming sim game, just as a test to see if it was possible to actually do it in twine. (this is using twine 1.3, which means it was made pre-2014 i think?) at one point i tried redoing it in a more ambitious fashion in twine 1.4, but i never really got it 'finished'. anyway i've been getting incredibly frustrated with the glacial pace of all my current game projects, and so i've been really struggling to think of a game idea that's not a giant sprawling mess of planned content, and so in two days i threw together this, which is a partial clone of the first twine version.

    yesterday i put together this thing, which isn't a playable game of any stripe (yet), but i have some plans for turning it into an actual game at some point (see source for details since i didn't erase those before uploading it :V). i plan on at the very least 'finishing' the farm game up to the point where it's a full clone of the twine game, & maybe uploading it to itch.io or something, idk

    anyway i keep thinking i want to get back onto a two-week-project schedule, since i really enjoyed that and got a lot of stuff done with it, so maybe i'll try to restart that going forwards. it would be nice to try and make some actual two-week games, instead of having all of the code be internal library code for giant projects, but who knows.

  • Feb. 16th, 2021
    Tags:
    • gamedev challenge
    posted @ 10:49 am

    l-systems & heightmaps

    okay so two week projects

    these have gotten a little more difficult to write since my mastodon instance vanished into nothingness so i don't have a collection of intermediate screenshots to share

    anyway uh january 14th-31st i worked on l-systems in javascript. a lot of the l-system implementations i've seen have been pretty simple ones that just allow for expansion rules, but the scientific literature on l-systems tends to use considerably more complex systems that have parameters or preconditions and postconditions or all sorts of other stuff. so implementing that in javascript in a way that wasn't a nightmare to write was a bit of an accomplishment.
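
    to make that a little more concrete: the actual implementation is javascript, but the core "parametric rules with preconditions" idea looks roughly like this. (sketched in haskell since that's what most of the code around here is in; no left/right context handling, and all the names are made up.)

    ```haskell
    -- a module is a symbol plus numeric parameters; a rule only fires if its
    -- precondition over those parameters holds, and its expansion gets to compute
    -- the replacement modules' parameters.
    data Module = Module { sym :: Char, params :: [Double] } deriving Show

    data Rule = Rule
      { matches :: Module -> Bool       -- precondition
      , expand  :: Module -> [Module]   -- parametric replacement
      }

    step :: [Rule] -> [Module] -> [Module]
    step rules = concatMap apply
      where
        apply m = case filter (`matches` m) rules of
          (r:_) -> expand r m
          []    -> [m]                  -- no rule applies: leave the module alone

    -- example rule: A(x) splits in two once x gets big enough, otherwise it keeps growing
    split :: Rule
    split = Rule
      { matches = \m -> sym m == 'A' && length (params m) == 1
      , expand  = \(Module _ [x]) ->
          if x > 1.0
            then [Module 'A' [x / 2], Module 'A' [x / 2]]
            else [Module 'A' [x * 1.5]]
      }
    ```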

    also, to render l-systems you generally need some kind of turtle geometry parser, and when you get into the more complex l-system actions, like generating 3d geometry, those can get pretty involved. i ended up hooking my turtle code up to the 3d svg renderer code i was working on before this, and while that still doesn't work right (i'm still not handling my matrix transforms right, so the bsp-cutting code doesn't work correctly) it's good enough to render some stuff in 3d and have it look 3d.
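
    and the turtle side, at its absolute simplest -- 2d only, no '['/']' branch stack, nothing like the actual renderer hookup -- just to show the fold-over-the-expanded-string structure:

    ```haskell
    -- walk the expanded symbol string, tracking position + heading, and emit a line
    -- segment for every 'F'. the real version is 3d and feeds its segments to the svg renderer.
    data Turtle = Turtle { pos :: (Double, Double), heading :: Double }

    turtle :: Double -> Double -> String -> [((Double, Double), (Double, Double))]
    turtle stepLen turnAngle = go (Turtle (0, 0) 0)
      where
        go _ [] = []
        go t (c:cs) = case c of
          'F' -> let t' = forward t in (pos t, pos t') : go t' cs
          '+' -> go t { heading = heading t + turnAngle } cs
          '-' -> go t { heading = heading t - turnAngle } cs
          _   -> go t cs                                  -- anything else: no geometry
        forward (Turtle (x, y) a) =
          Turtle (x + stepLen * cos a, y + stepLen * sin a) a
    ```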


    february 1st-14th was kind of a mess -- i started out wanting to render houses for my hex game. i've been playing a lot of 'kenshi' recently and so i was thinking about its house mechanics -- everything is in the same global world-space, it's just that when you step inside a house it stops rendering the exterior of the house and starts rendering the inside, so you get a bit of a cross-section view. it's kind of a clever use of the fact that polygons are all one-sided in 3d rendering.

    so at first i was like, sure i'll just make a custom House datatype and use that for buildings and expand that as i add more building features. but then i was like... well, that would mean totally re-coding stuff like collision and rendering and texture tiling for buildings, since they wouldn't be tile-based like the world heightmap, and then probably again for whatever i'd be doing for caves or underground sections. also, kenshi has this issue where buildings can't punch through the world heightmap, so it's impossible for any buildings to have a basement.

    that got me thinking about other ways to handle things that wouldn't require quite so much custom code, and so i've tentatively settled on layered heightmaps -- basically instead of a single big heightmap for the entire world, the world has three heightmaps that might have gaps or holes in them -- one is 'the ground', the second is an inverted heightmap for the undersides of overhangs, and the third is a higher 'floor'. these are stitched together in the same way as tiles are normally stitched together, although to render good-looking transition tiles between different materials i am gonna need slightly more information. this means that the worldmap itself can support overhanging cliffs to a limited degree, without making me go full-voxel, which is something i've always wanted to avoid for this project.

    it also means that for constructions i could just do something like, well, this building has its own chunk data, only it has like seven layers of heightmap rather than three. then buildings would use all the same rendering, collision, and tiling code as the worldmap, which would make coding them up a lot simpler. (this is inspired by the minecraft metaworlds mod, which solved the issues of custom airship mods not supporting modded blocks by just saying 'actually airships will just use chunks out near the world boundary at 31 million blocks away from spawn to store airship data instead of normal world data, so we won't have to think about how to serialize it at all'. turns out if you already have a way to store world geometry data, it's a lot easier to just reuse that than code something else that's a totally custom implementation.)
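
    for reference, the kind of data layout i have in mind is something like this (hypothetical types; none of this is what's actually in the code yet):

    ```haskell
    import qualified Data.Map.Strict as Map

    -- each layer maps tile coordinates to a height; a tile that's absent from a layer
    -- is a hole in it, which is what lets the ground open up under overhangs.
    type TileCoord = (Int, Int)

    data Facing = FacesUp | FacesDown   -- ground faces up; overhang undersides face down

    data Layer = Layer
      { layerHeights :: Map.Map TileCoord Int
      , layerFacing  :: Facing
      }

    -- a world chunk is three layers (ground, overhang underside, upper floor); a
    -- building would be the same type with however many layers it needs stacked in it.
    newtype Chunk = Chunk { chunkLayers :: [Layer] }
    ```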

    anyway that's not all fully done yet, and i'll have to add in a bunch more complex rendering stuff for things like 'walls that aren't a full tile thick', etc, but it's a decent start. all this chunk-generation stuff also got me thinking about one of the big remaining issues in the game, which is that it doesn't actually load chunks during the game -- you 'zone in' to a location and it loads a 6x6 grid of chunks around you, and then that's it; if you walk to the edge of the loaded zone first you walk around on invisible geometry and then you drop down to height 0 when you get to the absolute edge of loaded map data. getting map loading working would do a lot towards making this feel like an actual game.

    the big issue with that has been that since opengl actions can only be run on one thread, i couldn't really 'multithread' it and run map-loading ops in the background. (i could run map generation in a different thread, but ultimately rendering the loaded maps would have to happen in the main thread, since that involves buffer allocations and buffer writes that can only happen in the rendering thread.) so a while back i worked on coroutines, which are a way of doing thread-like computations in a single logical thread. i got that mostly working, but i never merged it into my main project. so i also did that! there was a bunch of state wrangling involved for that, and i'm still not done with it entirely, but i got it working well enough to add an initial loading screen for the game, for when you first load in any chunks. actual background chunk loading during the game will come in a bit, maybe even later today if i keep working at it, since i still have to do some data-management code there.

    anyway, that's some progress. i'm actually feeling kind of excited about this code stuff, since i've been working on 'engine' code for such a long time without any firm 'game concept' attached. but now the engine stuff is almost all done, and i'm starting to get into gameplay design finally. so that's neat. we'll see how that goes as i finish up more things; i've been kind of thinking of streaming me 'playing' the game and going over all the code stuff that went into it and what my plans for development actually are, but, well, we'll see where i'm at when all the engine stuff is done, and not just most of it.

  • Jan. 13th, 2021
    Tags:
    • gamedev challenge,
    • programming
    posted @ 12:31 pm

    okay it's been a while. uhhh so i moved across the country in october so that was exciting. also you know, everything else that's happened in the US since then has also been exciting.

    anyway i haven't really kept up on my two-week projects but i have done some stuff so i might as well post about it now.

    i didn't really do anything code-wise for october or november, since, moving and then the election had a way of making me real unable to focus on anything. early december i was gonna do procjam 2020 but i ended up not really getting much of anything done. i did some more stuff with svg 3d renders:

    • boxspin #1
    • boxspin #2
    • boxspin #3
    • boxspin #4
    • boxspin #5


    (this is a mostly-accurate implementation of a BSP-tree based polygon-cutting algorithm for running painter's algorithm to depth-sort 3d geometry.)

    but that never really got past the point of an early demo




    after that i mostly played a bunch of 'per aspera' and immediately got frustrated with how simplistic their terraforming model was so i was like "i'll make my own terraforming game"

    so i threw together some parts of that in the later half of december -- fixed up some of my polyhedra code slightly and spent a lot of time digging into GPipe to try to get framebuffer-based picking working. previously all my picking had been geometric: raycasting through a map grid and doing geometric calculations to see what was hit, and using that data to determine what if anything was under the mouse cursor.

    framebuffer-based picking is when you render the screen a second time, but with everything that can be picked (as in selected by the cursor) drawn a different color. then you can just look up what color a given pixel is in that framebuffer to see if there's anything to click on or hover over. this means no geometry aside from your usual camera transforms, but it does mean a more complex shader/rendering environment. GPipe actually had a framebuffer-reading glitch in its library code i had to fix before i could get things working. it took a while. but anyway i got it so you can properly pick across a planetoid that i'm rendering, which is a good first start for a planet-based terraforming game.
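
    the core trick, minus all the gpipe plumbing (which is the part that actually took the time), is just packing ids into colors and back:

    ```haskell
    import Data.Bits ((.&.), (.|.), shiftL, shiftR)
    import Data.Word (Word32, Word8)

    -- every pickable thing gets an id; the picking pass draws it in that id's color,
    -- and the pixel under the cursor gets unpacked back into an id. rgb8 gives you
    -- 24 bits, so ids have to stay under 2^24 (zero reserved for "nothing here").
    idToColor :: Word32 -> (Word8, Word8, Word8)
    idToColor i =
      ( fromIntegral (i `shiftR` 16 .&. 0xff)
      , fromIntegral (i `shiftR` 8  .&. 0xff)
      , fromIntegral (i .&. 0xff)
      )

    colorToId :: (Word8, Word8, Word8) -> Word32
    colorToId (r, g, b) =
          fromIntegral r `shiftL` 16
      .|. fromIntegral g `shiftL` 8
      .|. fromIntegral b
    ```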



    then in the first half of january i started hooking up actual UI -- there's currently no real "game model", but you can put down an initial building ("seed factory") and then use a menu to put down other constructions. i'm going to need to totally restructure my layout code, since it really wasn't designed for flexibility or robustness, so even doing something like "make a horizontal menu" is a whole mess of code that could be a lot simpler.

    but i'm feeling kind of burnt out on that and i'm about to start working on a different 2wk project, so i figured i should probably write a post here since it's been a while.

  • Oct. 5th, 2020
    Tags:
    • gamedev challenge,
    • programming
    posted @ 12:11 pm

    procedural skybox

    so, two week projects

    this one was actually a pretty big success! it's been hard to uhhh schedule anything these days, and going forward things are likely gonna be a mess for at least another month or two b/c i'm moving. but i wanted to get something nice and visual to show off, and one of the things i've been wanting to do for a while is shaders.

    so i made a skybox shader for my game!


    first let's maybe take a diversion into what a shader is, because people (well, ~gamers~) hear the word a lot but it's maybe not immediately obvious what exactly they are, aside from 'something to do with graphics'. you can skip this if you already know what a shader is.

    digital rendering is fundamentally about polygons, right? drawing triangles on the screen. that's a several-step process: placing some points in 3d space and connecting them together to make a triangle, figuring out what that triangle would look like on the 2d plane of the screen, and then filling in every visible pixel of that triangle with some appropriate color. these are all fairly tricky operations, and they're generally handled by the rendering library you're using.

    (this is a tangent, but for example, opengl has this property where edges are always seamless if their two endpoints are exactly the same, which is to say that the line equation they use to figure out which screen pixels are in a triangle will always be one side of the line or the other, with no gaps or overlaps, if you give it two triangles in the shape of a square with a diagonal line through it. they can seam if you generate geometry with 't-junctions', because then the endpoints aren't exactly the same-- this is something you have to keep in mind when generating geometry, & actually some of my map code does have seams because of the way i'm rendering edges.

    conversely, the old psx rendering architecture did not give this guarantee, which is why in a lot of psx games you can see the geometry flicker and seam apart with weird lines, which is where the 'is this pixel in this triangle or that one' test is failing for both triangles on what should be seamless geometry.)

    so this always involves a lot of really low-level math calculations -- when i say "figure out the position in 2d space", what that actually means is doing some matrix multiplication for every vertex to project it into screen space, and when i say "fill in every visible pixel", that means doing a bunch of line scanning loops. these things were what the first GPUs were designed to optimize for.

    so, opengl used to have what's called a fixed-function rendering pipeline: you basically had no control over what happened during the rendering process, aside from some very basic data. for example, your 3d points came with a 3d vector attached that was their position. but you could also attach another 3d vector to each point; this would be interpreted as a color: this gave you vertex-coloring. if you wanted to texture your triangles, instead of having them be a flat color, you could attach a 2d vector to each point; this would be interpreted as uv coordinates, or the place to sample from the currently-bound texture, and then it would multiply that sampled color by the vertex color to get the actual color. the final thing you could attach was another 3d vector, which would be interpreted as a vertex normal, and that would be used in the (fixed, hardcoded) opengl lighting calculations, to give you basic support for lighting techniques.

    there's a process in there that would look at the reference values at each vertex, look at how the vertices were positioned in space, and then interpolate between those values for each pixel. so this is how you got smooth shading, or working textures: because if you had one UV coordinate that was `0, 1` and another that was `1, 1` it would generate some difference vectors and then, while counting out each pixel, provide that pixel with an interpolated UV, leading to different values (color, textures, lighting) getting sampled across each pixel. (actually iirc the opengl built-in lighting was gouraud shading, not the more modern phong shading, and if you care about the difference you can look that up but this is already a really long shader explanation.)
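
    (if it helps, that whole fixed-function per-pixel combination boils down to something like the function below -- plain haskell rather than anything gl-shaped, and leaving out the ambient/specular parts of the real lighting model:)

    ```haskell
    import Linear (V2, V3, dot, normalize, (^*))

    -- interpolated vertex color, times the texture sample at the interpolated uv,
    -- times a diffuse term from the interpolated normal. this is a caricature of
    -- the fixed pipeline, but it's the general shape of what it computed per pixel.
    fixedFunctionPixel
      :: (V2 Float -> V3 Float)  -- texture sampler
      -> V3 Float                -- light direction, pointing toward the surface
      -> V3 Float                -- interpolated vertex color
      -> V2 Float                -- interpolated uv
      -> V3 Float                -- interpolated normal
      -> V3 Float
    fixedFunctionPixel sample lightDir color uv normal =
      let lit = max 0 (normalize normal `dot` negate (normalize lightDir))
      in (color * sample uv) ^* lit
    ```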

    so what opengl supported, near the beginning, was this: 3d points, colors, texture coords, lighting normals, and you could bind camera & perspective matrices, textures, and lights. those were the only options available, really.

    then opengl went up a few versions and decided to be more flexible. instead of all of those things being hardcoded into opengl itself, they decided, they would provide a programmable interface for people to write their own vertex and pixel code. this generalized everything here: instead of being able to send in only positions, or colors, or uvs, or normals, you could attach any kind of arbitrary information you wanted to a vertex, provided you could ultimately use it to output a position. and instead of pixels being rendered by applying these fixed transforms of vertex color times uv coordinate texture lookup sample times lighting angle, you could do anything you wanted with the per-pixel interpolated values, provided you could ultimately use it to output a final color.

    this blew things wide open. i remember back in the day when i had a computer old enough that it couldn't run shaders, and i couldn't run any newer games, and i was like, 'look shaders are all just fancy effects; i don't see why they can't just have a fallback mode'. and sure, if your shader is just a fancy blur effect you're probably still doing much the same kind of thing as the old opengl fixed pipeline: giving pixels colors based on base color and texture color and lighting level. but there's so much stuff that cannot possibly be replicated with the fixed-function pipeline. there's actually a really interesting talk by the thumper devs where they talk about how they make the track seamless by storing twist transforms in their shader code and applying them in their vertex shader, and doing most of their animations just by exposing a time value in their shaders (which is a very basic trick but it's one that was completely impossible in the old fixed pipeline).


    so i wanted to make a skybox shader, because so far i've just been drawing a flat blue void outside of world geometry. one of the things i've kinda been thinking about for a while is a game where you have to navigate by the stars at night (inspired of course by breath of fire 3's desert of death section), and that implies that you have 1. distinct stars and 2. an astronomically-accurate rotating skybox.

    so this is where the first bit of shader magic comes into it: i placed a quad at the far clipping plane, and then inverted my camera matrix so that i could push those values into world space and convert them into angular coordinates. by converting those to spherical coordinates, i could have a perfect spherical projection embedded into this flat quad, that would correctly respond to camera movements. keep that in mind for every screenshot i post about this: that this is a single flat quad, glued diorama-like to the far end of the camera; everything else is the projection math i'm doing per-pixel.
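
    the per-pixel math is roughly the following (made-up names; the real thing lives inside the gpipe fragment shader):

    ```haskell
    import Linear

    -- take the pixel's ndc position on the far plane, push it back through the
    -- inverse view-projection matrix into world space, and turn the resulting view
    -- direction into spherical coordinates (azimuth, elevation) to sample the sky with.
    skyDirection :: M44 Float -> V3 Float -> V2 Float -> V2 Float
    skyDirection invViewProj cameraPos ndc =
      let V2 x y      = ndc
          worldPos    = normalizePoint (invViewProj !* V4 x y 1 1)  -- z = 1: far plane
          V3 dx dy dz = worldPos - cameraPos
          azimuth     = atan2 dy dx
          elevation   = asin (dz / sqrt (dx*dx + dy*dy + dz*dz))
      in V2 azimuth elevation
    ```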

    so i queued up a bunch of posts on my screenshot tumblr; starting from here with the first attempts. (use the 'later' link).

    i wanted something stylized that matched the look of the rest of the game (so, aliased and triangular), so the first thing i wanted to figure out was how to draw a triangle onto the celestial sphere. this turned out to be pretty tricky, since shader code is pretty different from code you'd be writing elsewhere. like, if i wanted to draw a triangle polygon, i'd expect to calculate some points in space and then position lines between them. in a shader, you can't really do that -- or rather, it would be really inefficient. instead, you want an implicit equation, something that you can run in a fixed-form that gives you a value telling you how close the point is to the edge of the triangle, so that you can threshold that and get an actual triangle shape.
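
    the simplest version of that is a plain disc: per pixel, measure the angular distance from the pixel's sky direction to the disc's center, and threshold it. the triangle version layers an angular-sector test on top of this same distance, which is where it gets fiddly. roughly:

    ```haskell
    import Linear (V3, cross, dot, norm, normalize)

    -- angle between two directions on the unit sphere; atan2 of |a x b| and a . b
    -- is more numerically forgiving than acos of the dot product alone.
    angularDistance :: V3 Float -> V3 Float -> Float
    angularDistance a b =
      atan2 (norm (normalize a `cross` normalize b)) (normalize a `dot` normalize b)

    -- "is this pixel inside the disc": the thresholded implicit test, run per pixel.
    discMask :: V3 Float  -- direction of the disc's center on the celestial sphere
             -> Float     -- angular radius
             -> V3 Float  -- this pixel's sky direction
             -> Bool
    discMask center radius dir = angularDistance center dir < radius
    ```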

    i never actually got that working right, but i got it working kind of right and i figured, well, some distortion is fine.

    ultimately (as you can see if you page through all the posts) i got stars placed, constellation lines connected, the axis tilted, rotation happening, and even some very simple celestial object motion.

    the big flaws of this shader, currently: planets and moons are just flat circles. the major issue with that is that there are no moon phases, which looks weird as the moon advances relative to the sun in the plane of the ecliptic (the moon is up during the day when it's near a new moon, which irl we don't notice b/c the new moon is mostly invisible, but here it's still just a continual full moon). there's also no depth-testing of the celestial bodies relative to each other; here i always draw the moon over the sun, and those are both always drawn over venus and mars, which are drawn over jupiter and saturn. so when planets are visible they're not actually properly occluding each other; when i was drawing things from in orbit around jupiter, i had to manually move jupiter up in the draw order so that it would occlude the sun. there's also no 'eclipsing' code, which is an issue not just for solar eclipses, but also like, yeah on ganymede jupiter is continually eclipsing the sun, & between the 'moon phase' code and some 'eclipsing' code that should pretty radically change the look of the environment. but none of that is currently being handled at all.

    also i never did get the lighting angle correct -- at one point i got tired of the landscape being this fullbright shading regardless of the 'time of day' in the skybox, and so i pulled out some of the planetary calculations and just had the sun's position determine the light. this is another thing that's pretty hardcoded, right, since theoretically the moon should also be contributing light based on its phase, and if this is a jupiter-orbit situation where the sun is occluded then it should have its component cut out from the lighting calculation. not only am i not doing any of that, but i couldn't get the basic 'angle of the light in the sky' math right, so there's some weird displacement where the lighting doesn't really follow the sun all that accurately. i'm sure i'll figure it out at some point, and even just having any changing lighting looks great compared to what i had before.

    likewise i'm doing some 'day/night' calculations, but that's just slapping a blue filter over things and fading out the stars; i'd like to do something a little more in-depth.

    (also, since this game isn't gonna take place on earth i'd really like to add in some different stellar bodies -- more moons, different planets, that kind of thing. but that's its own project.)

    something else i discovered while working on this is that the gpipe shader code isn't very good. it aggressively inlines -- and i guess that makes sense because how could it not -- and that means that it generates hundreds or thousands of temporary variables. my shader code, which is fairly compact in haskell, undoubtedly expands into thousands and thousands of lines of incoherent garbage in GLSL, which is an issue because that's the code that's actually getting run. somebody mentioned a vulkan library for shader generation which addresses the inlining issue, but i don't know if a similar technique can be ported back to gpipe, and gpipe itself isn't really being actively updated. there's no full replacement for gpipe around, but i guess if anybody does make a similar vulkan library for haskell i might be tempted to switch over, if only because the shader compilation is really bad. not unworkable, but i probably don't have an enormous amount of shader budget to work with, just because the transpilation into GLSL is so inefficient without some way to manage the inlining.

    anyway here are some references that i looked at while working on this:

    • the deth p sun art that was one of the original inspirations for this
    • a starfield shader tutorial that i used some techniques from, notably the dot-product noise function and also the layering (which is how i have big stars, small stars, and really small stars in the galactic band)
    • a shader tutorial that helped me w/ the line-drawing sections for the constellations
    • ofc i was doing all this on spherical geometry, not cartesian, so i needed to check wikipedia a bunch for how to do the equivalent coordinate operation: 'spherical coordinate system', common coordinate transformations, great-circle distance
    • some stackoverflow stuff: how to calculate distance from a point to a line segment on a sphere (this is used in the constellation lines, right, b/c the pixel is the point and the constellation line is the line segment)
    • i didn't want to go overboard w/ star simulation, but i did want some varying colors, so i looked at the blackbody radiation spectrum and eyeballed it to get a coloring function. i had this idea to mix in a second spectrum of 'fantasy' colors so that i might get green or purple stars or w/e too, but i never got around to that.
    • i wasn't actually sure if it was possible to write draw-a-polygon code in a shader until i saw this shader, which in drawing that triangle implements a general-purpose arbitrary-regular-polygon function. i couldn't copy it directly but it definitely helped guide me towards something that worked in gpipe + on a sphere
    • also a bunch of people in various discords and irc channels, b/c actually i'm not super great with math! when i was griping about the pinching at the poles due to lat/long, somebody offhandedly was like 'oh yeah try taking the sine of that' and it turned out that was exactly what i should do.

    all-in-all really proud of this! it's the first time i've done non-trivial shader work, and even though it still needs a lot of work it's a pretty good starting point. now the landscape looks way more like a landscape, and less like some geometry floating in a void!


  • Sep. 2nd, 2020
    Tags:
    • gamedev challenge
    posted @ 07:03 pm

    okay two-week projects

    i was planning on doing map generation stuff, but before i did that i wanted to get some infrastructure stuff working. previously, the game started and gave a new game / quit menu, and new game would generate a landscape and drop you into that and then hitting escape would immediately close the game. i kind of wanted some way to render the world graph, and i wanted to get more of an interface loop going on, and working on those things took up basically the entire two weeks. the interface i have now is a lot closer to that of an actual game's, though: from the 'title screen', the new game option loads up a setting form for how to generate the worldmap graph, and then it generates a (stub, debug-rendered) worldmap that you can move around and look at, and when you select a tile you can zone into the actual world on that chunk. there are system menus accessible by hitting escape in either of those two modes, and you can do things like reset the worldmap or generate a new one, or quit back to the title screen. so fairly minimal things, ultimately, but way more of a UI than any of my games have really had before.

    going forward i'm hoping this two-week stretch is gonna have me rendering a world map in a fancier way and then maybe trying to get it to look more exciting. i have some ideas for what i want a prettier world-map render to look like, and ideas for how i want my world graph expansions to work, but it's hard to say what any of that will look like until i have a demo working.

  • Aug. 15th, 2020
    Tags:
    • gamedev challenge
    posted @ 11:54 pm

    coroutines!

    continuing to try to get back into the 2-week project loop. this time i tried using coroutines. at first i was like "i could probably write my own" and then i was like, wait there's almost certainly a continuations library i could use. there was; i ended up using monad-coroutine, which i chose over Coroutine b/c it had more comprehensible types.

    i'm still not really sure if this was a good or necessary idea -- the goal was to write some code that i could use to do loading / background loading. these days, threads are generally the thing to use to do background loading -- one thread stays busy loading whatever, occasionally writing status messages to some thread-safe variable, and the rest of the program just displays a loading screen with the status messages displayed, waiting for the loading thread to signal that it's done and return all the processed data. rendering stuff complicates this somewhat: opengl has a constraint that only the original thread that initialized the opengl context (as in the rendering window) can perform actions that require the context, which definitely includes buffer allocation and might also include buffer writes.

    coroutines have been kind of popularized recently by go, i think? in that go's 'goroutines' are basically just coroutines baked into the language syntax. previously i knew of them through the classic duff's device hack, which is not really a stunning endorsement of the concept. a coroutine is a pair of functions that return to each other, basically -- one function can yield, at which point the other function resumes from wherever it was when it handed off control, and then later on the other function can yield to allow the first function to resume. this isn't implemented through actual 'return' statements. but a timing-based coroutine -- calculate through this data, yielding a partial message when out of time, and resuming with a new quota of time -- is something that can be used to run loading screens, and it can run in the same thread so i don't have to worry about opengl weirdness.

    that being said, i don't actually know if this is a great solution: this coroutine library might be threaded anyway; the opengl restriction might not involve buffer writes; the opengl restriction is about OS-level threads and haskell 'threads' might not actually map onto real OS-level threads. but this is something i know how to do and use, so i guess i'll use it.

    anyway i wrote the infrastructure so that i could fairly easily write a fold/accum function and then lift it up to run through a timed coroutine, outputting partial results as necessary, which was neat, but then i got bogged down in the actual thing i wanted it to do: load and unload map chunks in a priority queue based on distance from the player character. that was a much more involved problem than just writing some timing infrastructure, so i kinda hit that and didn't really get anywhere.
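
    the shape of that infrastructure, roughly, using monad-coroutine (this version counts items instead of checking a wall-clock budget, just to keep the sketch small):

    ```haskell
    import Control.Monad.Coroutine (Coroutine(..))
    import Control.Monad.Coroutine.SuspensionFunctors (Yield(..), yield)

    -- fold over a list of work items, yielding the partial accumulator every
    -- `batch` items so whoever's driving this can draw a loading screen.
    foldCoroutine :: Monad m => Int -> (acc -> x -> acc) -> acc -> [x] -> Coroutine (Yield acc) m acc
    foldCoroutine batch f = go (0 :: Int)
      where
        go _ acc [] = return acc
        go n acc items@(x:xs)
          | n >= batch = yield acc >> go 0 acc items
          | otherwise  = go (n + 1) (f acc x) xs

    -- the driver: resume one slice, draw the loading screen with the partial
    -- result at every suspension, repeat until the coroutine finishes.
    runWithLoadingScreen :: Monad m => (acc -> m ()) -> Coroutine (Yield acc) m r -> m r
    runWithLoadingScreen draw c = do
      step <- resume c
      case step of
        Left (Yield partial rest) -> draw partial >> runWithLoadingScreen draw rest
        Right done                -> return done
    ```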

    oh well that's how it goes sometimes. this code will still be around when i have some other stuff to load, i guess.


  • Aug. 1st, 2020
    Tags:
    • gamedev challenge
    posted @ 10:57 pm

    render buffers

    so it's been a while

    my two-week project schedule got totally wrecked by covid stuff and uh as that stuff is still ongoing it's hard to really tell to what degree i'll be able to get back on track with actually, you know, doing things

    but eventually i got frustrated and started working on stuff again. i actually did an enormous amount of work on that really complicated combinatorics problem i've been struggling with for a while, and made some big breakthroughs with that, but it's not 100% complete so i'm waiting to post on that until i can actually show it off.

    anyway what i did over these past two weeks has been work on render buffers.

    so when you want a program to draw anything to the screen, you have to give it the 3d information: here are these vertices, and these three are connected into a triangle, and so forth. the vertices also tend to have data attached, for lighting or texturing or coloring. material data, that kind of thing. and it's not enough to just have that allocated in memory; due to all the optimizations you see on videocards, you need to feed this data to the videocard in really specific ways. enter the render buffer. a render buffer kinda represents a chunk of memory that's structured in such a way that it can be easily fed to a GPU or stored in video memory and then processed by a game's shaders and ultimately rendered on screen. haskell is a very 'high-level' language, but to effectively do rendering you need to kinda dig down into the low-level stuff again; this is in large part what libraries like gpipe are doing: giving a haskell gloss over some very low-level rendering operations.

    when geometry changes in the world (like, by terrain deformation) you'd ideally want to be able to erase the old geometry and place the new geometry quickly, by only writing to a small portion of the render buffer. however, a render buffer doesn't really come with an index; all it has is raw vertex data. if you want to relate those vertices to some kind of game knowledge, you have to do that yourself. so it's much simpler to overwrite the entire buffer.

    this is also made more complex by size constraints: if you have a flat surface that gets dug into, maybe the flat surface took up only 4 vertices, and the dug surface takes up 16. now you can't just blindly overwrite one with the other, or else you'll corrupt other geometry, since the replacement is bigger. if this is a game where the player can construct arbitrary geometry, either by minecraft-style placing blocks, or more indirectly by placing constructed objects, then you end up with maybe having to handle render writes from arbitrarily-complex geometry. so it's much easier conceptually to just clear and regenerate everything.

    for example, one of the reasons minecraft can be so slow is that it stores geometry in 16x16x16 cubes, and any time any block in that chunk is changed it regenerates and rewrites the entire chunk geometry to its render buffer. so if you have a bunch of complex blocks that all update rapidly, you end up with a chunk that's needing to be regenerated nearly every frame. this is part of why you can have 'laggy chunks'.

    so, there's a tension here with your rendering buffers: you want to be able to only write small updates, as needed, but that takes a bunch of bookkeeping math to keep everything squared away, and ultimately you might still need to reallocate a larger buffer and copy everything over. conversely, you could allocate oversized buffers, which waste space, but that would let you get away with less bookkeeping and mean you'd have to reallocate less often. there's no perfect solution, because there's a tension between wasted space from too-large buffers, the cost of reallocating, and the cost of your own bookkeeping code.
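
    boiled down, the per-update decision looks something like this (hypothetical types, with 'reallocate' standing in for the whole clear-and-regenerate path):

    ```haskell
    -- how much buffer space a region owns vs. how much the replacement geometry needs
    data BufferRegion = BufferRegion
      { regionOffset   :: Int   -- first vertex this region owns in the buffer
      , regionReserved :: Int   -- vertices reserved for it, possibly more than it uses
      }

    data WritePlan
      = InPlace Int Int         -- offset + vertex count: overwrite just this slice
      | Reallocate              -- doesn't fit: fall back to rebuilding the buffer

    planWrite :: BufferRegion -> Int -> WritePlan
    planWrite region newCount
      | newCount <= regionReserved region = InPlace (regionOffset region) newCount
      | otherwise                         = Reallocate
    ```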

    what i wanted was just, buffers that for the most part automatically Work when i push render data into them, without me needing to spend a lot of time manually calculating out index offsets and checking for buffer limits. when i started working on haskell rendering again, i kind of decided that i was done with doing anything low-level; that what i wanted to do was make something that properly encapsulated the actual low-level stuff in an abstraction that actually works.

    i got that working, is the takeaway. i did some sync stuff and some threading stuff and some indirection, and a bunch of low-level index management, and now for the most part i can automatically generate and update render buffers piecemeal in an efficient way while externally having a pretty idiomatic haskell interface. that's neat!

    also to test that i added some new kinds of world geometry to the maps. previously all tiles were smooth slopes with hard edges, and i changed that so that now hexes can be split in two or in three or into different kinds of steps, which (once i actually manage to work it into the generation/smoothing step) can add a lot of visual variety to the landscapes. obviously nowhere near as much as i'd like, but way more than was previously possible.

    anyway this basically lets me shove in arbitrarily-complex landscape and tile occupant geometry without having to do any counting or worry about allocations or data corruption. this will make adding new render stuff (currently: plants, rocks) a lot easier.

    • one set of landscapes, mostly glitchy
    • another set of landscapes, mostly working

  • Apr. 3rd, 2020
    Tags:
    • gamedev challenge
    posted @ 12:48 am

    oh yeah two week projects

    how about that global pandemic huh. that sure isn't good.

    it was another kinda scattershot project this time too. i expanded CTM support a little by adding in 'property' data. no tiles or objects actually produce property values yet, but that opens up a lot of flexibility in terms of texturing objects.

    (an aside: i play a lot of the regrowth minecraft modpack. my least favorite part is its deep dependence on 'agricraft', which is a mod that changes how plants grow and gives them real basic 'mutation' values. but on reflection the reasons i don't like it break down into two categories: one, it makes plants all grow the same. no more cacti farms with their weird grids, or sugarcane farms in rows bracketed by water, or melon or pumpkin farms in rows, etc etc. instead, every single plant just grows as a single block with a basic 'crops' render. so it reduces the meaningful variation in growing stuff. two, the mutation values are all really boring and not super well hooked into what the plants do. by default they go up every time the plant spreads, so you inevitably get a maxed-out 10/10/10 seed. but also, those values are just "growth speed" and "produce amount" and then a third "strength" value that might have to do with trampling or maybe taking cuttings? it doesn't really come up much. it's not really the concept of quantized plant values that i object to, it's that like... instead of a general 'produce' value that effectively ranges from "one produce" to "four produce", it would be neat if that hooked into specific plants' growth, so that something like witchery's snowbells could be trained to grow more snowballs or more ice needles, but not both. and then if that _also_ let you visually change what the plant looked like based on what that value was, that would be cool! you could actually visually identify yr plants, which would make needing to have a big */*/* stat overlay, uh, way less necessary. so basically i was thinking of all of this when i added in prop support, so now theoretically plants can just expose stuff like that and it's a resource pack thing if somebody wants to draw a whole bunch of custom textures for each data value variation.)

    i also got into rendering somewhat more complex models. a lot of this has to do with uv-wrapping the polyhedra i'm generating, so that i can actually texture them. this takes a little extra effort per-polyhedra, and i'm still not super happy with the regularity of all these shapes (nature is famously not all spheres and geometric solids) but it's still pretty nice to be able to draw prism bushes or rhombic dodecahedra trees or whatever.

    i also added some flags to the texturing code and reworked what data is made available when writing UVs. currently the only flag supported is 'linked', which is used for random textures to allow mix-and-match textures on a single object. one of the things this is useful for is drawing a few different 'leaves' textures and have them all mix-and-matched on a tree, rather than having each tree pick its leaf type across all its geometry. one of the things i'd really like to add is a wang tiling texturing mode, since polyhedra have all the edge data necessary to generate a random wang tiling. i'm just not really sure what that would mean outside of the context of polyhedra-generated models -- i definitely don't have the edge context needed to wang tile the actual landscape tiles. which is kind of unfortunate, since that would be neat.

    also i did some more work on rendering indices and updating them, though that's still not really a solved problem.

    this is less a synopsis of what i did and more a kind of aimless ramble but hey it's that kind of time in the world i guess. oof.

  • Mar. 16th, 2020
    Tags:
    • gamedev challenge
    posted @ 06:10 pm

    so, two-week project...

    i mean, around two weeks ago i was like "yikes ncov seems scary but i guess there are only like five cases in the whole US" and now my city is on lockdown and also i might be getting a fever. so that's kind of scary.

    still, i guess all i can really do is bunker down and try to wait things out and hope things don't get really bad. we'll see how it goes.

    anyway the goal for these past two weeks was gonna be to put together a gameplay loop, but... i got as far as coding in the interaction triggers and then i thought about all the new assets i'd have to draw and got kind of intimidated.

    i did get some other stuff done, though. after getting the texturing data working last time, i wrote a parser so that i could load 'resource pack' information from a data file instead of having it be hardcoded. something in that vein that's yet to be done is fixing up the texture 'name' code, so that i could add in what would effectively be metadata/property information to the resource pack and have that handle all of the texture selection (like providing different textures to plants at different growth stages).

    i also started using the menu code again -- you might recall i put together a 'form' type in one of my older projects. here i wanted to make a menu control, and so i expanded on that code to support game actions caused by selecting a menu item. this also meant changing the way my render code works to more solidly isolate the gpipe opengl context code (which is needed for render actions) from the general io (which is the only thing that the event code can run directly) -- basically i can't render directly inside the FRP event network, but i can have a 'render loop' running that polls the game data and runs render actions itself, which means figuring out some way to shuttle that information through the game data. in the end i just had to update a few types to handle stored buffers, since most of my code still just reallocs constantly instead of keeping other buffers around to update later.
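
    (what i actually did was stuff stored buffers into the game data, but the general 'event code describes the work, render loop performs it' shape is something like this -- a queue here purely for illustration, not the real types:)

    ```haskell
    import Control.Concurrent.STM (TQueue, atomically, tryReadTQueue, writeTQueue)

    -- stand-in for "something changed and its buffers need regenerating"
    data RenderRequest = ReuploadChunk (Int, Int) | RebuildMenu

    -- the frp/event side can only do plain io, so it just enqueues a description
    queueRender :: TQueue RenderRequest -> RenderRequest -> IO ()
    queueRender q = atomically . writeTQueue q

    -- the render loop, which owns the gl context, drains the queue once per frame
    drainRenders :: TQueue RenderRequest -> (RenderRequest -> IO ()) -> IO ()
    drainRenders q handle = do
      next <- atomically (tryReadTQueue q)
      case next of
        Just req -> handle req >> drainRenders q handle
        Nothing  -> return ()
    ```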

    so that all isn't an enormous amount of progress, but it's all stuff that's useful that needed to get done at some point.

    who knows what the next 2-wk project is gonna be. "don't die from covid-19" maybe. i mean i've been doing some stuff about generating more complex geometry and texturing it, and i've been looking at some gifs by oldmanofthefire@twitter (and also orchids to dusk) and thinking about stealing that visual style, but listen i'm gonna be honest, code stuff isn't really at the forefront of my mind right now

    hopefully the next two week project update happens as scheduled on april 1st and hopefully the whole global pandemic situation isn't totally spiraling out of control then. i mean, moreso than it already is.

  • Feb. 29th, 2020
    Tags:
    • gamedev challenge
    posted @ 07:36 pm

    this two-week project was just labelled 'general render improvements'. the rendering code i'm using could do some renders, but it couldn't really do much, and it wasn't really any one critical problem so much as a half-dozen different major issues.

    so, some of those have been fixed.

    over the past two weeks:
    • i got slopes working, for hexes and for tris, and started calculating their normals as well as possible given the vertex limits in play
    • i got slopes generating on the map with a basic smoothing pass
    • i put in some value noise heightmap worldgen, and got 'chunk loading' working more coherently
    • i restructured the buffer allocations slightly to make updates easier...
    • ...and wrote some interaction code that updates the map data and does an in-place buffer write
    • i made some super basic types for tile occupants, and got some plants rendering
    • i tweaked the shader slightly to discard transparent fragments -- this means that i don't have to depth-sort any cut-out, stenciled geometry like plants
    • i put in some rudimentary texturing types, inspired by the minecraft ctm mod

    there are a few glitches with all of those (plants are always drawn flat, so they kind of float on sloped tiles; the render update only works on a single chunk at a time, so hexes at the edge of a chunk end up not updating all their adjacent tris; etc) and there are still some real big rendering limitations -- most notably, the landscape is still a heightmap and can't handle overhangs or any variety of more complex geometry -- but on the whole this has made rendering a lot more structured and useful.

    i also got an idea for a workaround for gpipe's glitchy uniform code, which will help a lot when i need to use uniforms for real later on.

    so that's a lot! most notably, now that i have interactivity working this theoretically means i have all the parts needed to make, you know, a game with an actual gameplay loop in it, which is something that i kind of put off forever.

  • Feb. 14th, 2020
    Tags:
    • gamedev challenge
    posted @ 02:01 pm

    plan was to work on more graph embedding stuff but i ended up just not touching that code at all, whoops. maybe that'll be the back half of february.

    what i did end up working on was the new twine game i'll be making. specifically, a lot of the "engine" parts. you might ask 'what does a twine game need with an engine'.

    so this game is gonna be a bit more mechanics-focused than the last one, which means i need to think about how to lay out and run that mechanic. really a "game mechanic" is just the short form for "this is a thing that the game will repeat over and over with a fairly uniform interface", so, i gotta put together that uniform interface. it's gonna be something like a detective game, so the first thing i put together (before these two weeks) was a modular dialog system heavily inspired by sentient. (my first exposure to sentient was actually this video). and the thing with sentient is that it really highlights some of the drawbacks of having a modular dialog system. so this is definitely not done, and i think a big part of having the dialog system be workable and fun involves doing a bunch of tweaks and fixes as people play various demo versions and complain about it. but for any of that to happen i first did need a working prototype, so that was what i put together first. (that was this version, which was also incidentally the first version i posted to patreon as part of the development direction poll)

    but then after doing that i realized, well, this game isn't just gonna be talking to people. it's kind of implied that there's some kind of map, since all of these people will be in different places, and also maybe some kind of physical inspection of locations, since, detective work also classically involves looking for physical clues. so i was like, well, i probably need some kind of map.

    in the prior twine game the 'world map' was just a 7x7 box with some 'go north' 'go south' etc links attached. even though that was the simplest possible space, that was still really disorienting for a lot of people, so with this i kind of wanted to try something a little more visual. i also didn't really want to go with "the map is a big grid and every room is square", since, that kind of had a renaissance in porn games and then immediately went out of fashion when people realized how repetitive and prone to making giant dead areas it was.

    also, i recently downloaded a playstation emulator and started playing old playstation games.

    so, enter the 'vagrant story minimap' map system. this was kinda thrown together from a bunch of the existing svg+javascript 3d render code i had sitting around, so it wasn't even that difficult to implement -- the data entry aspect of making the map is a bunch bigger time sink than the actual underlying programming.

    (i'm considering seeing if this map setup is standalone enough to split it off into its own macro set, but tbh twine code has changed enough that i have no clue how to e.g., make this into a sugarcube macro, given the different ways in which sugarcube works.)

    all the demos, in order:

    • v4
    • v5
    • v6
    • v7
    • v8
    • v9
    • vX

    this still isn't fully complete. i mean, obviously, this is a big game project and will take a while to finish. but nearly all of the fundamental Interface stuff is done, and mostly what's left is writing a billion lines of dialog and description and sex scenes and the like. mechanically, i still need to add in some more special-casing support for dialog options, and at some point i want to add in blocked exits and secret exits to the map. and of course if i want there to be investigation sections i'll need to add that too. but for the most part, the game has enough of a skeleton that it's kind of clear how it will be played, even if there's basically no content in it yet.

    anyway so that's the 2-week project for these two weeks: 3d map interface code in twine, or, more specifically, doing the initial setup for the werewolf twine game

    i'm really gonna need a name for this so i'm not just gonna be calling it "the werewolf game" like i did for the prior twine game, which was "the locust porn game" up until basically the very end.

  • Feb. 1st, 2020
    Tags:
    • gamedev challenge
    posted @ 09:22 pm

    updated shader math

    okay two-week project reportback

    i rewrote the shader! this involved making a bunch of fiddly changes to the shader equations and what vertex data was stored and a bunch of other stuff like that.

    the three main changes i made were:

    1. the shader itself now does the lighting calculations, rather than me having to precompute them in the vertex color data. this means i can now have moving lights (well, a moving light) and opens the way to having a diffuse/specular material system, which would be kind of neat.
    2. atlased textures can now be wrapped. opengl has built-in support for wrapping a single texture around to repeat it, but you can only wrap the entire texture. once all the landscape textures have been atlased into a single big texture, in order to 'wrap around' it would have to wrap literally across all the other unrelated textures in the atlas before repeating itself. that's definitely not what i wanted. so now the vertices store some texture dimension data and i can wrap an individual texture as much as i want (there's a little sketch of the math just after this list).
    3. this is the big one and the reason i started doing the shader rewrite in the first place: the uv mapping of textures has been changed to be way less distorted. i posted a comparison image on monsterpit: first the hex textures and then the wall quad textures.
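
    the point-2 wrapping math, roughly (plain haskell rather than the actual gpipe shader code, and the names are made up):

    ```haskell
    import Linear (V2)

    -- gl's built-in repeat can only wrap the whole texture, so the vertex carries the
    -- sub-texture's rectangle within the atlas and the fragment math wraps by hand:
    -- take the fractional part of the tile-local uv, then map it into that rectangle.
    wrapAtlasUV :: V2 Float  -- origin of the sub-texture in atlas coordinates
                -> V2 Float  -- size of the sub-texture in atlas coordinates
                -> V2 Float  -- tile-local uv; values past 1.0 repeat
                -> V2 Float
    wrapAtlasUV origin size uv = origin + size * fmap fract uv
      where fract x = x - fromIntegral (floor x :: Int)
    ```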


    so that's neat!

    after that i drew some bigger 64x32 textures, which i think might be the size i standardize on for all the future stuff -- big enough to have some detail, but still small enough to effectively be pixel art. something to do in the future is to put together some more complex connected textures code so i can repeat textures in less repetitive ways, or have overlay textures for adjacent tiles, that kind of thing. i think that would make a huge difference in terms of making a landscape that looks good.

    also i got a little bit of the way through reimplementing hex slopes, but since i've done that a few times before i wasn't really super enthused to be doing it again. one of the big goals this year is to actually make progress on stuff instead of constantly retreading old ground, so it kind of sucks to have to (well, 'have to') recode old stuff to use the new buffer allocation code. i also did a little bit of work sketching out data structures for world data / buffer data that should work for stuffing arbitrary information into later on -- since the ultimate goal is to have much more complex landscapes than just a barren heightmap, i'm going to need game data structures that can actually store all that information in a reasonably structured and efficient way, and i think i have some pretty decent starting points.

    that being said i'm so sick of render code, so the next 2-wk project is gonna be all about graph grammars. like, now that i have a world data structure and some stuff rendering, it's actually time to start thinking about worldgen, and, specifically, finally bridging the gap between the world data and all those svg hex dungeons i've been working on for ages.

    so the next two weeks are gonna be graph expansion bugfixing + trying to work out some actual expansion decks that generate an interesting space. that's an entirely different kind of problem, so, we'll see how i do at it.

  • Jan. 14th, 2020
    Tags:
    • gamedev challenge
    posted @ 09:50 pm

    so, two-week project reportback

    this time: render buffers. as i mentioned last time i'd been kinda circling around rendering problems for a while, and in december i'd spent some time writing Storable instances for heightmap data.

    so uh the first week or so was spent working on that: i wrote new code that manually allocated a 'chunk' of heightmap data and stored that, rather than having a haskelly data structure. i also decided to try overallocating space for buffers so i could statically index their values -- basically certain parts of the world geometry might be 0 vertices, as in they might not exist at all, or they might be 4, or 8, or 12 vertices. allocating the correct value for each one would make the math a lot more complicated, so instead i'm allocating space for 12 vertices for every piece, even if the actual map doesn't use it at all, because then if i want to look up "where is this chunk of world geometry stored in the buffer" it's just a matter of counting out however many 12-vertex chunks come before it, instead of having a big lookup table.
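
    the static-indexing arithmetic is about as simple as it sounds (illustrative numbers and names):

    ```haskell
    -- every piece of a chunk gets space for 12 vertices whether it uses them or not,
    -- so finding a piece's slice of the buffer is multiplication, not a lookup table.
    verticesPerPiece :: Int
    verticesPerPiece = 12

    -- vertex offset of the n-th piece within its chunk's slice of the buffer
    pieceOffset :: Int -> Int
    pieceOffset n = n * verticesPerPiece

    -- every chunk's slice is the same size too, so chunks pack back to back the same way
    chunkStride :: Int -> Int
    chunkStride piecesPerChunk = piecesPerChunk * verticesPerPiece
    ```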

    so i did that and performance increased drastically. instead of the program taking up a ~2gb allocation to render 7 chunks, it took up ~200mb. increasing that to 37 chunks only bumped that up to ~250mb, so presumably a lot of that is just general haskell allocation. it also runs faster, since, all that allocation it was doing took time.

    this is kind of wild -- like, i knew haskell allocation wasn't really very fast, but it's still pretty wild to see a 10x improvement just from switching to a simpler allocation model.


    steps in redoing the render buffer:
    • at first i tried only rendering hexes
    • then i moved on to hexes and tris
    • then joins
    • then i actually calculated out tri values correctly
    • and then finally i got tri values calculating across 'chunk' edges so that there weren't big visible seams

    this code is still pretty basic and i haven't ported over a bunch of stuff that was in the older code yet, most notably slopes, and i didn't get around to writing up the code that would actually update chunk height data + rerender hexes, which is the big thing that i'd have to do if i want to actually use this code to, you know, deform the landscape or w/e. still, this is a pretty big chunk of work.

    i kinda squandered the latter half of this project, and that's because uhhh a few days ago i finally released the big twine game i've been working on for the past two years. kinda burying the lede here. so that's out! check it out! but also kinda obsessing about its release made me incapable of focusing on doing the nitty-gritty work of fixing up some buffer indexing code, so, i didn't actually get a huge amount of stuff done for this 2-week project.

    still calling it a success though, because, wow 10x improved memory allocation.

  • Jan. 1st, 2020
  • xax: purple-orange {11/3 knotwork star, pointed down ({11/3}sided)
    Tags:
    • gamedev challenge
    posted @ 08:17 pm

    okay gamedev thing

    i didn't make a post for december 1-15 because i didn't get an enormous amount of stuff done -- i mentioned in the last post that i wanted to get render buffer update stuff working but i got kinda sidetracked thinking about overall render efficiency. i tested out rendering larger sections of land and came to the conclusion that letting haskell do the memory management for certain kinds of render stuff was just, enormously inefficient. rendering a given chunk of land made the program eat up ~2gb of memory, when doing a little bit of math made me realize that it actually could've been something like ~70kb of heightmap data and ~256mb of vertex/index buffer data. not all of that 2gb was haskell stumbling over lazily allocating landscape stuff, but probably a pretty large chunk of it was.

    also i then spent some time writing Storable instances for some more efficient data types only to realize a bit later on that i should've been thinking more about vertex stuff -- the bottleneck here isn't the raw memory required to store some landscape data, it's in how to construct and update the data sent to the graphics card, since that has to be a bunch of vertices. so actually packing a 'chunk' of landscape into 10kb isn't really that helpful, since that was never really the problem; the problem was being able to render out that landscape into a compact, updatable chunk. so after that i kind of lost steam, although it's something i really want to revisit now that it's 2020 and i kinda want to actually, you know, make progress with this vague game concept instead of just spinning the wheels forever.
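
    (for context, the Storable instances i mean are the boring kind -- here's a sketch for a made-up packed heightmap cell, a height plus a material id in four bytes; the real field layout is different:)

        import Foreign.Storable (Storable (..))
        import Data.Word (Word16, Word8)

        data Cell = Cell { cellHeight :: !Word16, cellMaterial :: !Word8 }

        instance Storable Cell where
          sizeOf _    = 4   -- padded to four bytes
          alignment _ = 2
          peek ptr = do
            h <- peekByteOff ptr 0
            m <- peekByteOff ptr 2
            pure (Cell h m)
          poke ptr (Cell h m) = do
            pokeByteOff ptr 0 h
            pokeByteOff ptr 2 m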

    and then december 16-31 was mostly FINISH THE TWINE GAME. it's been in its final stages for months, and it's just needed a little more writing. this wasn't quite a success (i got sick at the last moment) but it's down to two sex scenes needing to be written and two other scenes needing to be edited, although after that i'm still gonna need to read through the entire thing to make some edits and tweaks. that's gonna be a big undertaking by itself, since the thing is at 280k words at this point. that's long.

    still you know, progress on both of those things even if it wasn't as much as i would've hoped.


    as a brief summary of the two-week projects this year: i started 2019 just piecing together the polyhedra data model i've been using for all sorts of stuff.

    • in january i used the polyhedra code to make planetoids, which was the first real usage of it. i put together a mostly-working FRP system, and used that to put together a basic ui forms library, which was a big step towards being able to, you know, manage input in any way beyond the totally trivial.
    • in february i got uniforms attached to shaders in gpipe, which really opened up how i could use shaders.
    • in march and april i got 'stretchy edges' working for graph grammar dungeons, which hugely expanded the ways they could be used and made it possible to generate way more complicated graphs.
    • in may i realized how to write the 'loft' operation in the polyhedra data model, and that that's basically a really simple way to extrude out complex shapes from simple rules.
    • june was apparently partly me working on wang tilings and partly some additional bugfixes for stretchy edges.
    • in july i apparently did nothing? mysterious.
    • august was when i got picking working, which was another huge step towards actually having any kind of meaningful interactivity -- like, on a low level, it's kind of amazing to think about how many games have an interface that is entirely "click on a ui element" or "click on an object in the world", and so now that i have the actual basics down it's gonna be way easier to actually, you know, run interactions. august was also when i wrote a new triangular-texel shader and worked out some basic texture atlasing for rendering world geometry.
    • september i used that uniform code plus the new shader to actually, uh, start using some uniforms for real in my shader code. i also tried some polyhedra coordinate systems, but that didn't really fully cohere.
    • october i put together the first render buffer code, as well as outlined some better syntax for my combinatorics data file code, which is gonna be real helpful when i next come back to that.
    • november was when i did procjam 2019 and i made that knotwork star generator.
    • december was kind of underwhelming, although the render allocation stuff is probably gonna pay off sooner rather than later.


    and a lot of that stuff is built on top of the 2-week projects i did in 2018 -- notably, like all the rendering and event stuff was only possible b/c on a whim i went looking for haskell rendering frameworks and messed around with gpipe for a bit. the whole existentially-quantified container concept that my forms UI library is built around started then, when i started looking at how reform did that internally. so here's hoping that some of the more disconnected projects i worked on in 2019 are things that i tie together and polish up in 2020! i guess we'll see in a year.

  • Dec. 4th, 2019
  • xax: purple-orange {11/3 knotwork star, pointed down ({11/3}sided)
    Tags:
    • gamedev challenge
    posted @ 08:26 pm

    so, two week project reportback

    not a whole lot to show for this one -- i started working on a little HTML5 game, but it was hard to stay enthused about it and it got totally derailed when i started playing morrowind again, whoops. (that being said i'm not gonna link to the half-finished thing b/c maybe i will go back and finish it for another two-week project at some point.)

    working on smaller projects is neat in that there's an actual semi-finished Thing at the end, but it's hard to stay invested in working on it when it doesn't really... give that sense of building towards something that i actually want to make, i guess? which is a little unfortunate, since it'd be good to focus on the entire process of making a thing rather than perpetually getting stuck in redoing really basic infrastructure things.

    speaking of, my next two-week project is gonna be working on rendering buffers again, ugh. there's actually only a little left to be done (one big bugfix, mostly) and then i'll have enough foundations in place to start mucking around with some actual gamedev stuff, rather than index counting and data marshaling. something i really want to work on this next year is accepting when something is Good Enough and trying to push through to work on new things, instead of endlessly belaboring the same system until it's perfect and bug-free. i've kind of incidentally assembled a bunch of gamedev tools during all of these two-week projects, and it would be really great to get to use them in a big project i care about, which is only gonna happen if i pull myself out of the low-level graphics-coding mire.

    also, itch.io has this analytics thing that shows where people are coming from, and it's a little weird to see how big a percentage is from this blog. i mean, i know i link it everywhere, and i know intellectually that there are in fact people who read these posts, but it's still a little weird to see a precise number of clickthroughs. anyway, hi. hopefully next year i'll have some bigger and more dramatic gamedev things to show off! we'll see.

  • Nov. 15th, 2019
  • xax: purple-orange {11/3 knotwork star, pointed down ({11/3}sided)
    Tags:
    • demos,
    • gamedev challenge
    posted @ 10:15 pm

    knotwork star gamedev reportback

    i made a new icon!

    (i'm actually probably gonna tweak it a little more before i'm totally satisfied with it. it needs a little less red.)

    so around ten years ago i wrote this wonky c program that drew a closed loop and then split it apart into inner and outer loops and then ran intersections against each segment and split everything apart into a bunch of intersections and rendered a woodcut / celtic knot style polystar, and that's how i got that seven-sided star that's been my icon online for, oh, years now

    but lately i've been kinda feeling like i needed a change.

    it was weird revisiting that problem. about half of the way through i realized i'd reimplemented everything i'd done before, and i was now facing the problems that i couldn't manage to figure out last time; there was this sense that i'd been following in my footsteps going "i did this last time" only to eventually realize that oh right, last time i ran into intractable problems and gave up, and if i wanted to actually succeed this time i'd have to actually think about how to do it, rather than trying to dredge up my memory of how i did it ten years ago. ten years ago i was kind of dumb and i've definitely gotten better at programming since then.

    (there's this whole thing where, if you want to space out the woodcut gaps evenly, you can't just displace a fixed distance along the line from the intersection point; you have to take the line's normal, displace the whole line along that normal by the fixed distance, and then intersect the displaced lines. this was something that i couldn't really figure out how to do, in c, ten years ago, and that meant that my old icons from that time didn't actually handle the case where there were multiple crosscut intersections on the same shape -- the gaps would be different sizes based on the angle of intersection. that is no longer the case for the new generator.)
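
    (in code the fix is literally "offset each line along its unit normal by the gap width, then intersect the offset lines" -- a small haskell sketch of the geometry, not the generator's actual code:)

        type V2 = (Double, Double)

        -- a line is (point on the line, direction vector)
        offsetLine :: Double -> (V2, V2) -> (V2, V2)
        offsetLine d ((px, py), (dx, dy)) =
          let len      = sqrt (dx * dx + dy * dy)
              (nx, ny) = (-dy / len, dx / len)   -- unit normal
          in ((px + nx * d, py + ny * d), (dx, dy))

        -- intersection of two infinite lines, Nothing if they're parallel
        intersect :: (V2, V2) -> (V2, V2) -> Maybe V2
        intersect ((px, py), (rx, ry)) ((qx, qy), (sx, sy))
          | abs denom < 1e-9 = Nothing
          | otherwise        = Just (px + t * rx, py + t * ry)
          where
            denom = rx * sy - ry * sx
            t     = ((qx - px) * sy - (qy - py) * sx) / denom

        -- an evenly-spaced gap corner: offset both lines, then intersect them
        gapCorner :: Double -> (V2, V2) -> (V2, V2) -> Maybe V2
        gapCorner gap a b = intersect (offsetLine gap a) (offsetLine gap b)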

    also, for basically the first time ever in doing one of these, the result is online for people to check out! it's over at my itch.io page!

    it's not perfect, since it's kind of blindly picking crosscut over/under pairings and some of those are literally impossible to render correctly, and also there are still some bugs, but on the whole i'm pretty proud of it, given that i got the bulk of it assembled and working in like a week.

  • Nov. 1st, 2019
  • xax: yellow-orange {7/2} knotwork star, pointed down (7sided)
    Tags:
    • gamedev challenge
    posted @ 01:41 pm

    october gamedev project reportback

    i am very behind on gamedev project posts

    okay the project from october 1st to october 15th was render buffers! it was a partial success.

    going into it all i really knew was that rerendering was very expensive. i was guessing it was the part where my code entirely reallocated a buffer every time instead of just mutating values in the existing one, but i didn't really have any firm idea of how to do that in haskell.

    specifically, because i wanted a big list of render actions, i was using existential quantification to hide the type signatures in render objects: each render object would store a shader with state s (like, texture references) that takes vertices of type a, as well as a static state value s and a render chunk representing a big buffer of type a values. and then it would make all those types vanish under quantification, so i could mix and match different shaders without having to do any special work with different constructors for each shader. and that was great for that purpose, but it also meant i had thoroughly obliterated all the type information for my vertex buffers, so it was impossible to ever rewrite them. in fact, even if i had the information for the buffer i still couldn't rebuild the shader object, because the shader type a would no longer unify with the buffer type a! (the program has basically forgotten that they're the same type)

    so anyway i solved that problem by just duplicating the value, after verifying that 'duplicating' the value was actually just taking a new pointer to the underlying buffer object and not, you know, actually copying the buffer itself. so then i could put one reference into the render object, and store the other in some non-quantified list that does have different constructors for each shader.
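
    (stripped way down, the shape of it is something like this -- a sketch with placeholder types standing in for the real gpipe buffers and shaders:)

        {-# LANGUAGE ExistentialQuantification #-}

        -- placeholder stand-ins so the sketch is self-contained
        data Shader s a = Shader
        data VertexBuffer a = VertexBuffer
        data LandscapeVertex = LandscapeVertex
        data ObjectVertex = ObjectVertex

        -- the render list hides both type parameters, which is exactly what
        -- makes the buffers unrecoverable from this side...
        data RenderObject = forall s a. RenderObject (Shader s a) s (VertexBuffer a)

        -- ...so a second, non-quantified reference to the same underlying
        -- buffer lives in a separate list, one constructor per vertex type
        data BufferRef
          = LandscapeBuffer (VertexBuffer LandscapeVertex)
          | ObjectBuffer (VertexBuffer ObjectVertex)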

    so that all works, but that's still not all the way to usability, because the actual goal is being able to update your buffers, which requires some additional infrastructure for tracking what's in the buffers and where and how to update them piecemeal. this part still doesn't work correctly; my landscape generation code seems to work differently when i'm rebuilding the entire landscape vs. when i'm only rebuilding a few tiles, even though it should be the exact same local context in the cases i'm using. also my indexing math isn't right or something so i end up corrupting all my vertex data somehow. but i did get far enough along to verify that yes mutating only a few indices is super fast compared to reallocating the entire buffer. that's not really a surprise in any way but it's nice to confirm that yes, that is where the hitching is from.

    theoretically when i work on this next it'll be something like... trying to write a typeclass (plus some supporting values) for piecemeal renderables, so that i can have a uniform interface for any kind of object that i can render only select parts of, which would let me very easily just be like "okay all these objects updated their data in some way, so regenerate only the relevant parts and rewrite only the relevant parts of their buffers" without having to actually think about buffer writes, which is my general goal with this code.
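
    (roughly the shape i'm imagining, all names hypothetical since none of this exists yet:)

        {-# LANGUAGE TypeFamilies #-}

        -- anything that can say which of its pieces are dirty and regenerate
        -- just those pieces' vertices
        class PiecemealRenderable r where
          type Piece r
          type Vertex r
          dirtyPieces     :: r -> [Piece r]
          regeneratePiece :: r -> Piece r -> [Vertex r]
          clearDirty      :: r -> r

        -- the generic update step: rewrite only the dirty pieces' slots,
        -- without any per-object buffer bookkeeping. the write action is
        -- whatever actually pokes a piece's slots in the underlying buffer.
        updateDirty :: (PiecemealRenderable r, Monad m)
                    => (Piece r -> [Vertex r] -> m ())
                    -> r -> m r
        updateDirty writePiece r = do
          mapM_ (\p -> writePiece p (regeneratePiece r p)) (dirtyPieces r)
          pure (clearDirty r)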

    so, partial success. doesn't actually work, but the foundation is solid enough.




    the project from october 16th to october 31st was data file syntax! it kind of petered out at the end as i got super into dwarf fortress again (more on that later) but i made a bunch of progress.

    so as you might know, i've been messing around with procedural generation from data files for a while. it was my very first gamedev two-week project! huh, two years ago exactly.

    i messed around with that concept a lot and rewrote the parser several times, but i never got it parsing precisely how i wanted (its whitespace rules were kind of finicky b/c newlines are sometimes terminators and sometimes not). also, as i added more features i started just bolting on syntax. "oh, i need some way to reference a generator's own attributes" "oh, i need some way to do list operations" "oh, i need some way to add or remove generators from the active item list". and i'd see that problem and add a new parsing rule for a new kind of action and keep going, but since i didn't actually plan out anything the parsing rules got super complex and the data files turned into this slapdash nightmare where there wasn't really any theme or metaphor to approach what it did; what it did was just "it does all the things i've specifically programmed it to do".

    so i wanted to 1. get the syntax to be more legible even when it's doing something complex, and 2. provide a sense that it's actually dealing with some kind of underlying conceptual object, rather than just running a bunch of rules i wrote in some order in some fashion.

    this was kind of a weird one, since usually i wouldn't really consider "writing some fake data files and a parser to read them" to be like, work? it's pre-work. but in this case i've already written the internal combinatorics code that this would use, several times, and in fact the part where i was thinking about the data organization as not work and not important was what led them to be such a tangled mess all the other times i worked on this concept.

    this one i didn't really finish, because i got distracted, but i made a bunch of headway. i got the parser working and parsing in data in the ways i wanted, and i added only a limited amount of syntax. it's hard to say how much of a success this was without being able to actually run code, since a lot of the mess i made previously was due to the ad-hoc nature of realizing i needed a feature and bolting it on to the existing edifice, but now i at least have a better idea of how i'm gonna add in new syntax, rather than just going like "oh well i'll add $foo values" "and ^foo values also" "oh wait i also could use [foo] and {foo}". and so forth and so on.

  • Oct. 4th, 2019
  • xax: yellow-orange {7/2} knotwork star, pointed down (7sided)
    Tags:
    • gamedev challenge,
    • programming
    posted @ 12:14 pm

    gamedev reportback: polyhedra coordinates

    so i wanted to take a break from rendering stuff for a while and work on something more theoretical, and decided to try polyhedra coordinates

    the goal is to eventually get polyhedra coordinates working with my graph generator, so that instead of making flat 2d maps i can wrap them around the surface of a planet, and use a different set of graph expansions to simulate (in a super rough and simplified way) some geological processes that will end up generating an entire planetary map

    that being said, this two week chunk of time did not accomplish that. i have a good idea of where i went wrong and how to fix it, but, out of time so i'm gonna stop working on it for the time being.

    (the thing with coordinates on spheres, or polyhedra generally, is that there's curvature. this comes with two general classes of complications: one, the math can't be everywhere-the-same in the way a flat square grid is always going to be x±1, y±1 for adjacent tiles, everywhere, since due to the nature of a coordinate system the distortion gets pushed around so that it's more present at some locations than others; and two, since the polyhedron wraps around, there's the question of what that looks like in the coordinate system. a wraparound square grid is like a torus, in that you can just say "left and right edges connect; top and bottom edges connect", which leads to some very simple math like if x < 0, x + x_max; if x >= x_max, x - x_max to perform the wraparound. on a polyhedron things like orientation come into play, where directions don't mean the same thing after crossing an edge.)
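
    (for comparison, the torus version really is a one-liner -- haskell's `mod` covers both the x < 0 and x >= x_max cases at once; the polyhedron case has no equivalent, because crossing a face edge can also re-orient your directions:)

        wrapTorus :: Int -> Int -> (Int, Int) -> (Int, Int)
        wrapTorus xMax yMax (x, y) = (x `mod` xMax, y `mod` yMax)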

    i had this whole idea of using a winged-edge data structure (which i've been using for all those polyhedra renders) for tracking the structure of a world, and mixing coordinate math with graph-traversal code to figure out paths.

    so like, specifically, i'm interested in doing coordinate math on G(n,0) goldberg polyhedra. the existing libraries i've seen that do this are earthgen and hexasphere, and they both handle things with a giant graph: each hexagon (or pentagon) tile is a node, and the six (or five) tiles it's adjacent to are the other nodes it has edges to, and they handle all pathfinding as a pure graph-traversal problem. that's fine for small shapes, but when you get to gigantic planetary-sized ones that's like, a five million node graph you have to keep in memory to do anything. also in a graph representation, there's no real sense of direction; you can't really say "go straight" since there's not a clear way to map edges to "opposite pairs".
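
    (the scale problem is easy to make concrete: a G(n,0) goldberg polyhedron has 10n² + 2 faces -- 12 pentagons plus hexagons -- so planetary values of n get into the millions of tiles fast:)

        goldbergTiles :: Int -> Int
        goldbergTiles n = 10 * n * n + 2

        -- e.g. goldbergTiles 700 == 4900002, i.e. the "five million node graph" range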

    my idea had to do with the underlying shape: all G(n,0) polyhedra are fundamentally shaped like dodecahedra/icosahedra (they're duals so it's kind of the same shape). that means that no matter how big the polyhedron gets, it'll still be topologically the same; there will just be more tiles there, but the fundamental shape of its distance metric won't change. so instead of storing a million-part graph, you can just store an icosahedron graph, and that's actually enough.
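
    (a sketch of what "just store the icosahedron" could look like as data -- twenty faces, each with its own local tile grid, plus a per-face record of which faces it borders and how directions re-index across that border. all names made up:)

        import Data.Map.Strict (Map)

        newtype FaceId = FaceId Int
          deriving (Show, Eq, Ord)

        type Reorient = Int  -- how local directions rotate when crossing an edge

        data Face tile = Face
          { faceTiles     :: Map (Int, Int) tile    -- local (u, v) coordinates
          , faceNeighbors :: [(FaceId, Reorient)]   -- the three adjacent faces
          }

        newtype World tile = World (Map FaceId (Face tile))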

    all that theory still seems sound, it's just when i got into the actual nitty-gritty coordinate math i ended up doing some math wrong and got impatient and kind of lost the thread and ended up with a mess that didn't come close to working. so that's a little disappointing, but, oh well, it's all just practice for the time when i actually get it right.

  • Sep. 14th, 2019
  • xax: yellow-orange {7/2} knotwork star, pointed down (7sided)
    Tags:
    • gamedev challenge
    posted @ 05:21 pm

    okay so this last two-week project was about getting a player model controllable and placed in the world. i got about half the stuff i wanted done? the big things are i got uniforms working for the shader, so now things can be positioned in a moderately efficient way, rather than having their buffers entirely regenerated and reallocated just to move something around by a few units. so that's helpful. i also restructured the camera code some -- it was pretty tangled up due to the way i had initially written the camera transforms. the usual case of all the equations being wrong, but wrong in ways tuned against each other so that everything worked out in the end, just leading to some internal state that was persistently wrong. which was fine until i needed the correct internal state for other equations. i also added some really basic 'height' controls, where the player model (a hexagonal prism) is snapped to the height of the tiles it's on, and the camera correspondingly follows it. it's not super pleasant to move around, but it works

    i wanted to add in collision and movement on slopes and maybe even jumping/falling, but after thinking about it i realized that i'm not actually storing reasonable collision data anywhere -- i'm rendering the world geometry, but i'm not keeping track of spandrel values, and the current code that's used to figure out which hex the player model is on calculates the value for a hexagonal grid, rather than the somewhat more complex tri-hexagonal grid i'm using. so i'd have to change around how i'm calculating and storing map data, and that just... wasn't something i really wanted to work on a bunch right now, honestly.


  • Aug. 31st, 2019
  • xax: yellow-orange {7/2} knotwork star, pointed down (7sided)
    Tags:
    • gamedev challenge,
    • programming
    posted @ 10:14 pm

    so this 2 week project started out as "work on the rendering pipeline", since that's slow and bad, and ended up morphing into mostly being about texturing.

    the original plan was to do low-level rendering changes, stuff like doing in-place render updates, or just reuse of existing buffers rather than every update needing to be a full reallocation. the main problem was that doing a growth tick took like a third of a second due to the re-render, and that was just... not acceptable performance under any circumstance. but it turns out fixing that is difficult

    i did end up restructuring the code to use vertex indices (instead of duplicating vertices), and i split render updates into landscape updates and object updates, which did speed up the growth ticks a lot, but it didn't really solve the underlying problems.

    instead i got textures working. this involved a few things: writing a shader to do triangular texel mapping, changing the rendering code to produce correct uvs for all the world geometry, asset-loading code to load different textures, some tile data-handling to get their information into the right places where it could be used to figure out what texture they should use, texture-atlasing code to merge textures together so that i could merge geometry together into one big chunk that all shares the same gl texture, and some very rudimentary code to handle polyhedra "models" that generate the correct uvs for certain shapes. in the process of doing this, though, i did break (more like disable) the old lighting code, and i'm not sure what i'll do to replace it eventually.
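
    (the atlasing part at least is simple math: a tile's local uv gets squeezed into its cell of an s-by-s atlas, so lots of tiles can share one gl texture. a sketch that ignores padding and mipmap bleed, with made-up names:)

        atlasUV :: Int               -- ^ the atlas is s × s tiles
                -> Int               -- ^ which tile (row-major index)
                -> (Float, Float)    -- ^ uv within the tile, each in [0,1]
                -> (Float, Float)    -- ^ uv within the whole atlas
        atlasUV s tile (u, v) =
          let cell       = 1 / fromIntegral s
              (row, col) = tile `divMod` s
              (u0, v0)   = (fromIntegral col * cell, fromIntegral row * cell)
          in (u0 + u * cell, v0 + v * cell)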

    so like, that's a fair chunk of stuff that's important and needed to be done, but it's not quite the low-level graphics stuff i was anticipating working on.

    • tile uvs
    • less eye-searing uvs
    • triangular texels
    • texture atlasing
    • correct wall uvs
    • way faster growth ticks (anigif)
    • uvs for simple models

    this has been kind of funny, since the new textures kind of look terrible? so all this work has basically just made the game look less nice, until i draw some less bad textures and work out some ctm (connected texture mapping) stuff so it's not all minecraft-style repeating tiles. it's a big technical achievement, though.

