There are three reasons why I never looked into adding "interpolated fixed timestep" style framerate independence to GG2 (note: throughout, "physics" means physics plus game logic):
1) Game maker.
2) When done with a lot of headroom, it raises the input latency floor. Instead of an average latency of "framerate interval/2 plus frametime plus ticktime", you get "framerate interval/2 plus frametime plus tick interval plus ticktime" (see the sketch after this list). The tick interval is almost always longer than the framerate interval, by design, because people pick low physics tickrates like 15 or 20 ticks per second.
3) You need to implement some kind of delta time anyway. Sometimes a user genuinely can't run the physics simulation even at 20 ticks per second.
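For reference, here's roughly what an interpolated fixed timestep loop looks like. This is a minimal sketch of the standard accumulator pattern, not GG2 or game maker code; update_physics and render are hypothetical stand-ins for the real game's functions, and the 4-tick backlog cap is an assumed number:

```rust
use std::time::Instant;

// assumed tick rate of 20 physics ticks per second
const TICK_DT: f64 = 1.0 / 20.0;

fn update_physics(_dt: f64) { /* one tick of physics + game logic */ }
fn render(_alpha: f64) { /* draw, blending the last two physics states by alpha */ }

fn main() {
    let mut previous = Instant::now();
    let mut accumulator = 0.0;
    loop {
        let now = Instant::now();
        accumulator += now.duration_since(previous).as_secs_f64();
        previous = now;

        // cap the backlog so a machine that can't hit the tick rate doesn't
        // spiral into simulating ever more ticks per frame; a capped game
        // falls behind real time instead, which is why point 3 says you end
        // up needing some kind of delta time anyway
        accumulator = accumulator.min(4.0 * TICK_DT);

        // catch the simulation up in fixed-size steps
        while accumulator >= TICK_DT {
            update_physics(TICK_DT);
            accumulator -= TICK_DT;
        }

        // alpha in [0, 1) is how far we are into the current tick;
        // rendering a blend of the last two completed ticks is where the
        // extra tick interval of latency in point 2 comes from
        let alpha = accumulator / TICK_DT;
        render(alpha);
    }
}
```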
I have no issues with updating GG2 to newer versions of game maker.
As for other platforms like python or C++: I made a toy game maker-like language in rust called gammakit (https://github.com/wareya/gammakit), but it's not gonna go anywhere unless someone gives me a million dollars or something.
It's not actually difficult to make something resembling very early versions of GG2. What's hard is making something that supports a wide enough range of functionality that you never have to dig into and extend the platform itself while using it. Having to do that causes bad structural problems unless you are one of the very most skilled AND EXPERIENCED programmers. This is what killed benetnasch/kotareci.
edit: I fucked up writing point 2 originally, so I rewrote the math explanation. If you're still confused, here:
think_time : how long it takes to simulate one tick's physics/logic, from start to finish
render_time : same, but for rendering
think_interval : time from the start of one physics/logic tick to the start of the next
render_interval : same, but for rendering
sequential_interval : when you run think -> render -> repeat in a single loop, think_interval and render_interval are both exactly this, and think_time + render_time add up to at most this
when you do things the way game maker currently does, you get this average latency:
think_time + render_time + sequential_interval/2
when you use interpolated fixed timestep, you get this instead:
think_time + render_time + render_interval/2 + think_interval
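To put assumed numbers on it (60 fps rendering, 20 physics ticks per second, and placeholder 2 ms think/render costs; none of these come from measurements of GG2):

```rust
fn main() {
    let render_interval = 1000.0 / 60.0; // ~16.7 ms
    let think_interval = 1000.0 / 20.0;  // 50 ms
    let (think_time, render_time) = (2.0, 2.0); // placeholder costs in ms

    // current game maker behavior; sequential_interval is just the
    // 60 fps frame interval when there's headroom
    let sequential = think_time + render_time + render_interval / 2.0;

    // interpolated fixed timestep
    let interpolated = think_time + render_time + render_interval / 2.0 + think_interval;

    println!("sequential:   {:.1} ms", sequential);   // 12.3 ms
    println!("interpolated: {:.1} ms", interpolated); // 62.3 ms
}
```

The gap is exactly one think_interval: because the renderer blends the last two completed ticks, an input can't affect what's on screen until a full extra tick has gone by.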