Published on 12th April 2024
Welcome to Phaser World Issue 175.
Game of the Week: Fush
It's a fish puzzle game with two puzzles. Have fun! Also, try clicking the buttons on the radio.
Two easy puzzle minigames
I just wanted to share these two puzzle minigames with you - they can provide some fun.
Neighborly Love
Programming, design, and sound by me. Art by my wife.
Tilemap spotlight with dark rooms
Move with the arrow keys.
Phaser 3 Tutorial: Getting Started with Shaders
Interested in learning more about WebGL, how to use this API to create awesome shaders, and how to use those shaders in your Phaser 3 games? Then you won't want to miss this video.
Build a 3D HTML5 game like “Stairs” using Phaser and Three.js – Step 1: building an endless staircase with random spikes
In this step I show you the endless Phaser + Three.js staircase, with random spikes added to each step and a fog effect. See also part 2, part 3, part 4, part 5, and part 6.
We have a comprehensive look at the new Phaser renderer in Ben's "A Week in the Pixel Mines" entry below, but the more 'normal' Dev Log has been put on hold this week to make way for something a little different.
Because today, Phaser is 11 years old! Woo! Happy Birthday, us!
It's very nearly a precocious teenager :)
Something that I really wanted to do on the new site was to collect together all of the Phaser Dev Logs into one single location. So that is what we've done! Prior to this, the dev logs were spread over Patreon, my WordPress blog, Google Mailing Lists, and GitHub. After a lot of work, node scripting, and countless import errors, we've collected them all in one place. There are a staggering 294 of them in total! And I'm typing out another one right this moment, so 295 :)
Let's take a peek at a few highlights from over the years.
How it Started
On this day in 2013, I released version 0.5 to GitHub and published the first ever Phaser Dev Log. In it, I talk about how I was inspired by the Flash game library Flixel and how I wanted to bring that to the web. And Phaser was the result. The following day, Ilija made a post featuring his sketches of what would become the Phaser logo and characters.
Towards the end of June 2013 I was closing in on the 1.0 release in this dev log. Here you can really get a feel for how things are coming together. New features are dropping in and the structure is getting solidified.
Four months later, Phaser 1.1 was out. In Dev Log 6 I'm clearly happy with how things are going :) And you can see the start of what made Phaser popular: hundreds of examples, full API docs, etc.
It wasn't long after releasing v1.0 that v2.0 came around. This was a dramatic change for Phaser, as it integrated Pixi internally, giving it WebGL rendering support, and you can get a sense of how important this was in Dev Log 12. We stuck with Pixi for the whole life of Phaser v2 and it was a fruitful relationship. As most of you probably know, we built our own WebGL renderer for Phaser 3. The reason was that development on Pixi had really stalled. Goodboy was just so busy with client work that Pixi had taken a real back seat right at the time when Phaser was growing and growing. I often wonder what would have happened had Pixi kept pace with us back then - would we be on Phaser v8 today powered by Pixi v8? Who knows :)
In June 2014, Apple added WebGL support to Safari on iOS. This was a massive thing at the time! Which is why I covered it in Dev Log 15.
By January 2015, we had started work on the first iteration of the new Phaser 3 renderer. This was being built by Pete Baron, and you can find details in Dev Log 19. Throughout 2015 you can see we're releasing more products and working through the evolution of Phaser 3.
Progress on Phaser 3 was steady and consistent throughout all of 2016. In addition to v3, I was constantly updating Phaser v2 and releasing new tutorials and plugins. The Phaser Patreon was running at this time, and you can see the influence it had on my ability to dedicate more of my time to Phaser work. It was a very productive year! It concluded with me suggesting we should rename Phaser 3 to Lazer. That was a strange period of time, and I wrapped it all up in the end-of-year Dev Log 97.
Development of Phaser 3 carried on for the whole of 2017, as you can see in all of that year's Dev Logs! Some key events happened, including getting a grant from Mozilla and lots and lots of Beta releases. By early 2018 we were getting very close to release, yet I was always looking forwards - as you can read in Dev Log 151. With v3 not even released, I was already planning how we could improve it by adding 3D and moving to ES6!
Finally, in Dev Log 158 Phaser 3 is released! Oh wow, that was so much work. Virtually every single part of Phaser had been rewritten from scratch, some parts multiple times. It had taken years to release. In hindsight, it took too long. It was too ambitious, and we tried to do too much. It took years for the new API to 'settle down' and lots of bugs to get ironed out. Phaser 3 today is a vastly different beast from what it was in 2018. It's much more mature and battle-tested.
Just over a year after v3.0 came out, I wrote in Dev Log 214 about Phaser 4! The paint was barely dry on the walls, and already I was looking to throw the baby out with the bathwater and recode it all in TypeScript. The spaceship in this illustration sums it all up:
Throughout the 2019 Dev Logs, you can see lots of R&D progress with Phaser 4 and continued releases of Phaser 3. Then, in 2020, that little pandemic hit the world. Development work carried on with the release of Phaser 3.24 and more Phaser 4 updates.
It's safe to say that, looking back on it, by the end of 2021 and for the first half of 2022 I was utterly burned out. This is reflected in the quantity of Dev Logs during this time. It's a full year between the beta 4 release of 3.60 and the beta 17 release. As the year progresses, the Dev Logs get bigger and more in-depth, such as Dev Log 270, all while having to deal with some tragic personal news.
Phaser 4 hasn't gone anywhere because I think I've come to the realization that, actually, doing it all again is a pretty stupid idea. It will just fragment the user base even further. So, I stalled the idea and started to figure out how Phaser 3 could benefit from the Phaser 4 code.
Many of the new techniques I learned while doing that R\&D fed back into Phaser 3, manifesting in the v3.60 release (and the subsequent 3.70 and 3.80). Honestly, these feel like the most feature-rich and stable releases of Phaser we've ever had, and you can clearly see I'm happy with the progress in Dev Log 275.
How It's Going
And then, on the 18th of December, 2023, in Dev Log 285, I talked about the formation of Phaser Studio Inc with OCV.
With significant funding in place, I started to recruit a team around me, and we have been absolutely firing on all cylinders since then! Things couldn't be more different than this time a year ago. Phaser is stable, we're building up great tooling around it, devs are releasing amazing games with it, and honestly, the future is bright.
Looking back I can definitely see a few key moments in the past where I should have done things differently. But no-one is perfect and hindsight is 20/20. I made the best decisions I could at the time. Now with the team in place, we will make the best decisions we can going forwards.
There are some really exciting things on our roadmap and we will start to unveil more of these in the coming weeks. So stay tuned, things are going to get even more exciting :)
Ben: 2024.04.12
This week saw the solidification of some fundamental parts of the new render system.
- Framebuffer wrappers
- DrawingContext life cycles
- RenderNode standards
- Batch standards
It's all a bit technical, but I'm trying my best to make it an accessible system, where everything obeys clear rules. So let's see how those rules are shaping up.
Framebuffer Wrappers
We introduced wrappers in Phaser v3.80 as a way to handle context loss, but they're rapidly turning out to be a powerful abstraction for managing the whole WebGL state. They let us know when we can skip a WebGL command because the new state would be the same as the old state, and I've talked before about how many commands that saves.
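To make that concrete, here's a minimal sketch of the skipping idea, with hypothetical names rather than Phaser's actual wrapper API:

```typescript
// Illustrative only: remember the last state we set, and skip the WebGL call
// entirely when nothing would change.
class BlendFuncWrapperSketch
{
    private current: [ GLenum, GLenum ] | null = null;

    constructor (private gl: WebGLRenderingContext) {}

    set (src: GLenum, dst: GLenum): void
    {
        if (this.current && this.current[0] === src && this.current[1] === dst)
        {
            return; // The new state would be the old state, so skip the command.
        }

        this.gl.blendFunc(src, dst);
        this.current = [ src, dst ];
    }
}
```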
Framebuffer wrappers give us additional powers. A framebuffer is a collection of attachments, including a color attachment and optional stencil and depth attachments. You can think of it as a clipboard with several sheets of paper attached. When we draw to the framebuffer, we actually draw to those attachments.
The wrapper now tracks those attachments, and has a place to mark which attachments have been drawn to and which are scheduled to be cleared. I've removed some properties such as width and height, because it turns out a framebuffer requires all its attachments to match, and we can just derive those from the color attachment, which is always a pre-existing texture. I've also added options to define the depth and stencil attachments separately, because Phaser doesn't use the depth attachment for anything and we can save some memory by skipping it.
Most importantly, the new wrapper now supports no framebuffer. Or, more specifically, the "canvas framebuffer" provided when we create the WebGL rendering context. This was previously not addressable in the system except by binding null to gl.FRAMEBUFFER. With support for this, the framebuffer wrapper can track use of the canvas itself, which turns out to be quite useful.
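As an illustration, a wrapper along these lines might track something like the following. This is a hypothetical shape, not the actual Phaser implementation, and the real wrapper carries more state than this:

```typescript
// Illustrative only: a framebuffer wrapper where `null` stands in for the
// canvas framebuffer provided by the WebGL context itself.
interface FramebufferWrapperSketch
{
    framebuffer: WebGLFramebuffer | null;   // null = the canvas framebuffer
    colorAttachment: WebGLTexture | null;   // width and height derive from this; null for the canvas
    hasDepth: boolean;                      // optional; skipped to save memory
    hasStencil: boolean;
    drawnTo: Set<GLenum>;                   // attachments marked as having contents
    toClear: Set<GLenum>;                   // attachments scheduled for clearing
}

// Binding then works the same whether it is the canvas or not:
// gl.bindFramebuffer(gl.FRAMEBUFFER, wrapper.framebuffer);
```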
DrawingContext Life Cycles
A DrawingContext is a part of this new rendering system. It's one step up from a framebuffer, and contains information about stencils, scissors, and such.
You can think of a DrawingContext as a drafting table, upon which the framebuffer can be placed.
One framebuffer can be used by multiple DrawingContexts at the same time. This allows us to pass a framebuffer around the render system, without losing track of what each context is doing. This is an important part of making the system more robust and encapsulated.
A DrawingContext has three key moments in its life cycle:
- use
- beginDraw
- release
These are all used to ensure that the framebuffer is set up correctly, and is cleared as efficiently as possible. Like all complicated things, it's very simple.
- When a DrawingContext enters use, it tells the framebuffer that it's being used. The framebuffer counts how many contexts are using it.
- We check the framebuffer's contents against the context's autoClear bitmask. If it has any contents left over from the last time it was used, we mark this on the framebuffer.
- If we draw to the framebuffer, and there are leftovers, we clear those attachments just before drawing.
- When we draw to the framebuffer, we mark those attachments as having contents.
- When the DrawingContext leaves use, and there are no more contexts using the framebuffer, and there are leftovers, we clear those attachments.
- When we clear attachments, we remove the contents marks and the leftover marks.
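Here's a rough sketch of those rules in code. The names and shapes are illustrative (and the autoClear bitmask is simplified to a single boolean); the real DrawingContext and framebuffer wrapper carry more state than this:

```typescript
// Illustrative only: a DrawingContext counting its use of a framebuffer and
// clearing leftovers as lazily as possible.
interface FBState
{
    inUseCount: number;     // how many DrawingContexts are currently using it
    hasContents: boolean;   // something has been drawn since the last clear
    leftovers: boolean;     // contents remain from a previous use
    clear (): void;
}

class DrawingContextSketch
{
    // autoClear is really a per-attachment bitmask; a boolean keeps the sketch short.
    constructor (private fb: FBState, private autoClear: boolean) {}

    use (): void
    {
        this.fb.inUseCount++;

        // Contents left over from last time are flagged, not cleared yet.
        if (this.autoClear && this.fb.hasContents)
        {
            this.fb.leftovers = true;
        }
    }

    beginDraw (): void
    {
        // Clear only when we are actually about to draw over leftovers.
        if (this.fb.leftovers)
        {
            this.fb.clear();
            this.fb.leftovers = false;
        }

        this.fb.hasContents = true;
    }

    release (): void
    {
        this.fb.inUseCount--;

        // The last context out clears whatever leftovers remain.
        if (this.fb.inUseCount === 0 && this.fb.leftovers)
        {
            this.fb.clear();
            this.fb.leftovers = false;
            this.fb.hasContents = false;
        }
    }
}
```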
This is now so efficient it can actually break WebGL Debug in the example labs. If there is nothing to draw, it clears the canvas and then just stops issuing commands. Spector waits for a command to bring up the debug interface. But nothing is happening, so you have to add something back to the canvas for the debug to appear.
Is this necessary for performance? No - we're talking about the difference between drawing nothing and drawing an empty screen, neither of which is very demanding. However, this logic is necessary for handling "spare" framebuffers, which could show up with any weird stuff on them, and could have nothing inside them. This just ensures that they render efficiently and don't get fooled by edge cases.
RenderNode Standards
I've been trying different names for the components of the new render system, and I've settled on RenderNode. The render system is effectively a tree, and these "nodes" form the branches of that tree.
Each RenderNode has a run method, which takes some input, may do some rendering, and may have some output. RenderNodes frequently call sequences of other RenderNodes. They may even call themselves, because they're singletons: instances kept in a manager and invoked when necessary. The actual graph is implicit in the function call stack.
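To give a feel for the shape of this, here is a minimal sketch with illustrative names, not the final Phaser API: each node lives in a manager and calls other nodes by fetching them from it.

```typescript
// Illustrative only: singleton nodes held by a manager, forming an implicit
// graph through their run() calls.
abstract class RenderNodeSketch
{
    constructor (public name: string, protected manager: RenderNodeManagerSketch) {}

    // Takes some input, may do some rendering, may return some output.
    abstract run (input: unknown): unknown;
}

class RenderNodeManagerSketch
{
    private nodes = new Map<string, RenderNodeSketch>();

    add (node: RenderNodeSketch): void
    {
        this.nodes.set(node.name, node);
    }

    get (name: string): RenderNodeSketch | undefined
    {
        return this.nodes.get(name);
    }
}

// Example: a compositor node that delegates each child to another node.
class ListCompositorSketch extends RenderNodeSketch
{
    run (children: unknown): void
    {
        const single = this.manager.get('Single');

        for (const child of children as unknown[])
        {
            single?.run(child);
        }
    }
}
```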
We can still track the graph, however. Here's an actual output from earlier today. This structure is not at all final, because I'm still breaking up some nodes and getting batch rendering ready, but it gives a glimpse into how the structure works:
```
Camera
  ListCompositor
    Single
      TransformQuad
    Single
      TransformQuad
    Single
      TransformQuad
    Single
      TransformQuad
```
As you can see, Phaser starts rendering with a "Camera", which processes a list of child game objects ("ListCompositor"). The "Single" node assembles the vertex data for a single quad, using "TransformQuad" for some general purpose computations, and passes it to the renderer to put it on screen.
We're going to make plenty more nodes, as we shift over from the Pipeline system. My hope is that this will be more accessible, and open up new rendering opportunities.
Batch Standards
One reason why Phaser runs so fast is its use of batch rendering. This is a simple piece of good practice: render as many things at once as possible. In practice, that means loading lots of data into a single vertex buffer, and running a single shader program over it.
Batching has its limits, however: everything must use the same settings. It must use one shader program, with the same uniform settings, the same textures, the same blend mode, etc. So if we want to render a different shader, such as for FX on a game object, or use a different texture, we must finish the current batch and start a new one. (Phaser is clever, and uses as many texture units as possible in parallel to make larger batches. But the limitations are still there.)
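As a rough sketch of that rule (illustrative names only, and ignoring the multi-texture trick mentioned above): keep appending data while the settings match, and flush the moment any of them change.

```typescript
// Illustrative only: accumulate vertex data until the shader, texture or
// blend mode changes, then flush the current batch in a single draw call.
class BatchSketch
{
    private vertices: number[] = [];
    private shader: string | null = null;
    private texture: WebGLTexture | null = null;
    private blendMode: number | null = null;

    add (shader: string, texture: WebGLTexture, blendMode: number, verts: number[]): void
    {
        const settingsChanged =
            this.vertices.length > 0 &&
            (shader !== this.shader || texture !== this.texture || blendMode !== this.blendMode);

        if (settingsChanged)
        {
            this.flush();
        }

        this.shader = shader;
        this.texture = texture;
        this.blendMode = blendMode;
        this.vertices.push(...verts);
    }

    flush (): void
    {
        if (this.vertices.length === 0) { return; }

        // Upload this.vertices to a vertex buffer and issue one draw call here.
        this.vertices.length = 0;
    }
}
```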
I've come up with some terms that help us reason about batches. This was all implicit in earlier versions of Phaser, but I think giving things names helps us to use their full power.
When we render a game object, we call its renderWebGL method. This currently exists, but generally interfaces with the pipeline system and doesn't return anything. I'm updating it to use different parameters and optionally return data for use in batches.
- A Stand-Alone Render (SAR) returns nothing, and should handle all its own rendering.
- A Standard Batch Render (SBR) returns information that can go into a batch, and be rendered later.
Obviously, SARs create more branches on the tree.
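As a hedged sketch of that contract (the actual method signature in the new renderer may differ): an SBR hands back something the batcher can merge, while an SAR draws directly and returns nothing.

```typescript
// Illustrative only: the two kinds of render method.
type BatchDataSketch = { vertices: Float32Array; texture: WebGLTexture };

interface RenderableSketch
{
    // SBR: return data for the batch system to render later.
    // SAR: render immediately and return null.
    renderWebGL (renderer: unknown, camera: unknown): BatchDataSketch | null;
}
```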
SBRs expose an important fact about Phaser: it has a standard data type for drawing batches. But what exactly is that data type? Again, it's implicit, and I want to make it explicit.
The batch must contain a set of geometric primitives. These should be either completely untransformed, or fully transformed to screen-space coordinates. We want them fully transformed, because this means the SBR can decide how to handle the camera itself, and some draw operations such as filled rectangles ignore the camera transform.
So we don't have to output any quads. We can put out triangles, defined by a series of vertices. Each vertex contains position, texture coordinate, and tint color. The triangle itself contains texture, frame, and tint mode (none of which can vary across a triangle, so far as I understand). And the whole batch output contains blend mode and FX data.
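Here's a sketch of how that implicit data type might be written down, using illustrative field names rather than a finalized Phaser interface:

```typescript
// Illustrative only: the layered batch data described above.
interface BatchVertexSketch
{
    x: number;
    y: number;      // fully transformed, screen-space position
    u: number;
    v: number;      // texture coordinate
    tint: number;   // tint color
}

interface BatchTriangleSketch
{
    vertices: [ BatchVertexSketch, BatchVertexSketch, BatchVertexSketch ];
    texture: WebGLTexture;  // cannot vary within a triangle
    frame: unknown;         // texture frame; cannot vary within a triangle
    tintMode: number;       // cannot vary within a triangle
}

interface BatchOutputSketch
{
    triangles: BatchTriangleSketch[];
    blendMode: number;      // shared across the whole output
    fx: unknown[];          // FX data, shared across the whole output
}
```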
The batch system can then use this scoped data to decide where batch boundaries fit.
(This is a lot simpler than some options I've been looking at, which had to pass out transform data, or make trade-offs with repeated quad transformations, which are impossible because quad transforms take axis-aligned input but don't guarantee axis-aligned output.)
These standards take in a surprisingly large number of Phaser's draw operations. For example, a filled rectangle (such as a camera flash) is actually a batch operation: it draws a blank white texture, which can be tinted to any color.
We still have to break up batches around blend modes and FX, however. And that's what the whole render graph approach is for - it gives us the mental tools to understand the drawing sequence. I hope that it helps us make Phaser more stable and efficient, and maybe even empowers users to make that power their own.
Phaser Releases
- Phaser 3.80.1 released 27th February 2024.
- Phaser Editor 2D 3.67 released 22nd February 2024.
- Phaser CE 2.20.0 released 13th December 2022.
Have some news you'd like published? Email support@phaser.io or tweet us.
Missed an issue? Check out the Back Issues page.
© 2024 Phaser Studio Inc | 548 Market St PMB 90114, San Francisco CA 94104