
Phaser Development Update (December 2019)

Published on 16th December 2019


Greetings everyone! I wanted to post a quick update to let you know what I’ve been working on for the past couple of weeks and also to explain where the November Backers Pack is.

If you’ve been keeping an eye on commits to the Phaser repo, you’ll no doubt have seen a flurry of activity surrounding Matter Physics. Matter is easily one of the most misunderstood aspects of Phaser. Conceptually, it starts out clear enough: a full-body physics system that uses Verlet integration. Yet, because it’s a third-party library, the way it works is very different from the rest of Phaser. As soon as you need to move beyond the helping hands that the Phaser abstraction provides, things get complex, quickly.

I knew this was the case, mostly because of the large number of Matter-related questions that appear on the forum and Discord, but also because whenever I tried to do something in Matter that I assumed was quite straightforward, it always took far longer than expected and required lots of diving into the code and API Docs.

It has also always bothered me how hard it is to tell what range of values certain functions expect. For example, when setting gravity, increasing the value by just a fraction can be catastrophic for the simulation. If you want to stack a large number of bodies on top of each other, you have to start adjusting values such as the Engine’s position iteration count in order to improve stability. And once you achieve that, dropping in a body with a large mass can make everything fall apart again. I’ve often spent a good while balancing here and there, increasing or decreasing something by 0.01, to get it to a point I was happy with, only to find that tweaking, say, the gravity would blow everything up (or, more usually, make bodies sink into one another before being propelled out of the world at high speed). And don’t even get me started on the impact that the position or length of a constraint can have.
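To give a flavor of the tuning involved, here’s a minimal sketch of a Phaser Matter config. The values are illustrative, not recommendations; `positionIterations` and `velocityIterations` are the Engine iteration settings I mean above:

```javascript
// Illustrative values only - small changes to any of these can
// noticeably alter the stability of the simulation.
const config = {
    type: Phaser.AUTO,
    width: 800,
    height: 600,
    physics: {
        default: 'matter',
        matter: {
            gravity: { y: 1 },       // nudging this up a fraction can topple a stable stack
            positionIterations: 6,   // raise this to help large stacks settle
            velocityIterations: 4    // higher is more accurate, but slower
        }
    },
    scene: {
        create: function ()
        {
            // The classic stress test: a stack of boxes
            for (let i = 0; i < 10; i++)
            {
                this.matter.add.rectangle(400, 560 - i * 42, 40, 40);
            }
        }
    }
};

new Phaser.Game(config);
```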

This isn’t entirely Matter’s fault, of course. Physics is hard. Even so, it feels like there are a lot of magic numbers and settings involved. Getting a stable simulation is usually more the result of luck and duct tape than an actual understanding of what’s going on. And the moment you want to do something more complex, such as creating contact sensors or compound bodies, or performing ray intersection tests, you’re entirely on your own.

So my plan for the November Backers Pack was to create a whole bundle of examples and mini-tutorials focused on Matter. It was something a few of you had requested and, honestly, it just made sense. Much like Alice in Wonderland, I jumped down the rabbit hole with absolutely no idea how deep this journey would take me, or how mad it would get.

Things started out well. I coded new functions that allow for really handy checks, such as “Is this point within a body?”, “Which bodies intersect this rectangle?” and even “Which bodies intersect this ray?”. I created some fun examples and, while doing that, realized that the Matter debug renderer needed some work. For example, it was impossible to render just constraints and not bodies. You couldn’t selectively render objects, either; it was pretty much a blanket ‘on / off’ switch. This is why debug rendering has always been labeled “Do not use in production”. However, that didn’t stop a number of you wanting to use it in production!
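As a taste, here’s roughly how those checks look in code, based on the functions as they ship in Phaser 3.22 (treat the exact signatures as approximate and check the docs):

```javascript
// Which bodies contain this point?
const underPointer = this.matter.intersectPoint(pointer.worldX, pointer.worldY);

// Which bodies overlap this rectangle?
const inZone = this.matter.intersectRect(100, 100, 200, 150);

// Which bodies does a ray from (0, 300) to (800, 300) pass through?
const hitByRay = this.matter.intersectRay(0, 300, 800, 300);
```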

I ended up spending a few days recoding it from scratch. Now, you can specify your own Graphics object to render to. You can control, on a per-body-part basis, what is drawn or not. You can also set fill colors as well as stroke colors, line thickness, opacity and different colors for joints and springs, all fully customizable per body. I also exposed all of the render functions publicly. This means you can call, say, `renderBody` or `renderJoint` and pass in your own Graphics object to render to, allowing you to use it in production without having to enable debugging across the whole of Matter.
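For example, here’s a sketch of drawing a single body to your own Graphics object each frame. The optional styling parameters `renderBody` accepts are omitted, and `player` is a hypothetical sprite; check the 3.22 docs for the full signature:

```javascript
// Draw just one body each frame to our own Graphics object -
// no global Matter debug overlay required.
const gfx = this.add.graphics();

this.events.on('postupdate', () => {
    gfx.clear();
    this.matter.world.renderBody(player.body, gfx, false);
});
```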

It was going well. I was winning. Lots of useful new examples were being created and the pack was filling up. Because I try really hard to offer as many useful things as I can, I sketched out some ideas for an app that could prove useful for everyone. The concept was an app that would let you load an image and have it automatically traced. The tracing would then be converted into polygons, and a nice shiny ‘Preview’ button would let you see it immediately as a Matter body, right within the app. Finally, you could export the vertices data for your own games.
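To show what that exported data is for, here’s a hedged sketch of feeding a set of traced vertices into Matter via Phaser’s standard `fromVertices` factory. The vertex string below is made up purely for illustration:

```javascript
// A hypothetical set of exported vertices, as a space-separated path string
const verts = '50 0 63 38 100 38 69 59 82 100 50 75 18 100 31 59 0 38 37 38';

// Build a Matter body from the traced polygon data
const star = this.matter.add.fromVertices(400, 300, verts, {
    restitution: 0.9
});
```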

Thankfully, it didn’t take that long to get the initial app built. Tracing the image was painless and I found a really neat, brand-new polygon decomposition library to handle that part. I worked hard on the interface too, building in features to let you swap between the vertices and polygon view modes, and adding sliders so you can change the decomposition and tracing settings and get a real-time view of the impact they have on the resulting polygon count. It was working beautifully, until I added the Matter preview.

The first thing I noticed was that, very often, especially with bodies consisting of multiple polygons (i.e. sub-bodies), the body wouldn’t center correctly with the texture. This is because a body in Matter is positioned by its center of mass, not the center of its bounds, and the center of mass shifts depending on the area of each polygon. It took me a while to figure out a solution and I had to modify the Matter API to get it to work consistently. All of a sudden, another few days had passed.
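To make the problem concrete, here’s a small sketch of measuring how far the center of mass sits from the visual center of a body. The helper is mine, not part of any API:

```javascript
// A Matter body's `position` is its center of mass, which for irregular,
// multi-part bodies rarely matches the center of its bounds - and it's the
// bounds center that a texture visually lines up with.
function getCenterOffset (body)
{
    const boundsCenterX = (body.bounds.min.x + body.bounds.max.x) / 2;
    const boundsCenterY = (body.bounds.min.y + body.bounds.max.y) / 2;

    return {
        x: body.position.x - boundsCenterX,
        y: body.position.y - boundsCenterY
    };
}
```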

With textures aligning properly, I figured it’d be fun to be able to throw the preview body around, so I dropped a Mouse Constraint into the scene. Simple enough, right? Except I noticed something very strange. The more vigorously you moved the pointer, the more the body vertices would drift away from the body’s position. If you held the body carefully and ‘gently’, all was well. But move the mouse at any kind of speed and suddenly the sub-body vertices were out of alignment with the body position. This was a new one on me, and frustrating, because I didn’t have a clue where to start debugging it. Was it the vertices data that was wrong, or the position? Perhaps it was the pointer data being sent to the body too rapidly, out of sync with the Matter Engine update? Perhaps it only manifested in a fixed-delta environment? I didn’t know and I didn’t like it.
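For reference, the setup itself really is trivial; in Phaser’s Matter wrapper it’s a one-liner:

```javascript
// Lets the pointer grab and drag any Matter body in the scene
this.matter.add.mouseSpring({ stiffness: 0.2 });
```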

I literally spent several days trying to work through it. I implemented the whole of the Matter Runner system to rule out delta issues. I recoded the Pointer Constraint system so that DOM Events were batched and only applied during the Engine update, and nothing about the body or constraints was changed during the input events themselves. It still didn’t fix it. Although, in hindsight, that was a change worth making anyway, so at least it felt like a small win.
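The batching idea, in sketch form (this is the shape of the fix, not the actual Phaser source):

```javascript
// Queue pointer positions as they arrive...
const pendingMoves = [];

canvas.addEventListener('pointermove', (event) => {
    // Record only - never touch the body or constraint here
    pendingMoves.push({ x: event.offsetX, y: event.offsetY });
});

// ...and apply only the latest one, once per engine step
Matter.Events.on(engine, 'beforeUpdate', () => {
    const latest = pendingMoves.pop();

    if (latest)
    {
        constraint.pointA.x = latest.x;
        constraint.pointA.y = latest.y;
        pendingMoves.length = 0;
    }
});
```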

Interestingly, the bug didn’t manifest if the body was simple. A simple body in Matter has just one convex polygon and no children; a rectangle or a triangle, for example. If the body has one polygon, you can thrash it around with the mouse until your hands fall off and it never goes out of sync. Which at least eliminated one potential avenue. I spent hours coding a `syncVerts` feature that literally re-syncs the body vertices back to match the body’s position at the end of the Engine update step. It worked and solved the problem, at the cost of a lingering nasty-hack smell emanating from every line of code. It just didn’t feel right. The deeper issue had been side-stepped, not resolved.
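The gist of the hack, sketched against the raw Matter API (the real version lives inside Phaser’s Matter wrapper):

```javascript
// After every engine step, translate each part's vertices so that their
// centroid lines up with the part's position again, undoing any drift.
Matter.Events.on(engine, 'afterUpdate', () => {
    Matter.Composite.allBodies(engine.world).forEach((body) => {
        body.parts.forEach((part) => {
            const centre = Matter.Vertices.centre(part.vertices);

            Matter.Vertices.translate(part.vertices, {
                x: part.position.x - centre.x,
                y: part.position.y - centre.y
            });
        });
    });
});
```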

Which is where I am today. I’ve got a neat bundle of examples put together, a really nice exclusive physics body tracing app for you and a whole raft of great updates to Matter built into Phaser 3.22, but this one lingering bug (and I genuinely feel it’s a significant one) casts its ugly shadow over everything. So my plan is this: I’m going to spend another day trying to isolate and fix whatever is causing this in the root API. If I can’t manage it, I need to just move on, put the final touches to the pack and get it published before Christmas descends upon us all.

Phaser Recording Studio Finished!

In other news, I’ve finished setting up my new recording studio in the office. I had been planning this for a long time and finally got around to getting it all sorted. I rent an office in the town where I live, which consists of 3 separate rooms (and a kitchen). This may sound excessive, as I’m the only one in the office, but because I live far from any big cities, in a small town in the middle of a forest, rent is incredibly cheap. Yes, it’s more of a geek den than a ‘serious’ office, but hey, indulge me. Having plenty of space allows me to do things I couldn’t if I worked from home, like build a mini studio.

For a long time now I’ve wanted to create Phaser tutorial videos. And not just a one-off, either, but a proper series of them. While I’m happy publishing written tutorials, let’s face it: games are visual things. It can make such a difference to see the code come alive on-screen. Sure, I could have just whacked Camtasia, or some other screen-recording app, on my PC and talked into my mic, but I didn’t feel that would have the right level of quality. I didn’t want you seeing all my desktop icons, or pop-up notifications of meetings, or people messaging me on Discord. I also run at a really high resolution on a 3-monitor set-up; far too high to capture properly.

For me personally, the most effective tutorial videos are those that present a consistent image. The presenter’s desktop should be perfectly clean. There should be nothing going on but the code and the end result: no distractions, random icons or pop-ups. It should be almost surgical in nature, and look the same for every video in a series. Quite the opposite of what my working desktop is like.

So that is what I’ve been building. Throughout the year I’ve been picking up the bits of hardware I would need: a shock-mount here, a pop-filter there, a microphone boom arm, a dedicated PC for recording work and lots of hard drive space. Even a new monitor sized so I can capture it full-screen and have it fit an HD video perfectly, without any distortion. I finally finished setting it all up last week and I’m very happy with the result. I can now record tutorial videos cleanly and easily, from a single workstation, so they’re all consistent in how they look and sound. It’s a great position to be in. When I get back after the Christmas break I’ll begin recording. At first, the videos will be published to Patreon and, over time, they should build into a decent collection.

For now, though, I'm going to return to the trenches and battle it out with Matter.js once again. I sincerely hope I win.