Planet Gamedev

Gamasutra Feature Articles

Road to the IGF: Kevin Regamey's Phonopath

January 30, 2015 09:41 PM

As part of our ongoing Road to the IGF interview series with nominees, Power Up Audio co-founder Kevin Regamey explains how he built the award-nominated audio puzzle game Phonopath and why. ...

Don't Miss: The making of Elite

January 30, 2015 08:56 PM

Elite co-creator David Braben shares the motivation behind and genesis of the genre-defining space flight sim, which stood against industry demands for another arcade-patterned game. ...

Get a job: Crystal Dynamics seeks a Senior Environment Artist

January 30, 2015 08:45 PM

The house responsible for the Tomb Raider reboot is looking to hire an experienced environment artist to work alongside the Rise of the Tomb Raider team in its Redwood City, CA office. ...

Video: How to build a healthy eSports community around your game

January 30, 2015 08:29 PM

As part of the GDC Next 2014 Community Summit, Jason Xu (CEO of eSports management platform Battlefy) presented developers with data and strategies for encouraging competitive play in their games. ...

A muse on Metroid II: A maze of murderscapes

January 30, 2015 08:12 PM

"I first played Metroid II: Return of Samus in the dimly lit women's wing of a homeless shelter in Providence, Rhode Island. Nestled in my palms like a religious text is the video game machine." ...

Joystiq and Massively shut down amid AOL downsizing

January 30, 2015 08:00 PM

The rumors are true: Venerable games media outlet Joystiq is closing up shop alongside its sister site Massively and a host of other AOL-owned websites. Many people have lost their jobs. ...

Analysis: Hardware revisions not stopping the decline of the 3DS business

January 30, 2015 07:56 PM

Gamasutra analyst Matt Matthews shows in his new post that the 3DS business has been in steady decline: "the continued erosion of their active hardware base is a serious issue," he writes. ...

Shake-up at Sega sees up to 300 jobs cut and an office shuttered

January 30, 2015 07:07 PM

Japanese company Sega Sammy announced today that it plans to restructure the company in a way that will see roughly 300 jobs (or more) cut and Sega of America's San Francisco office shuttered. ...

Analysis: Nintendo's turned around its digital business

January 30, 2015 06:39 PM

According to Gamasutra analyst Matt Matthews, "Nintendo is finally all-in on digital sales," and he here provides a historical analysis of the company's performance in the digital download realm. ...

Catch Croteam's Talos Principle postmortem at GDC

January 30, 2015 05:06 PM

GDC officials confirm Alen Ladavec and Davor Hunski, CTO and CCO (respectively) of Croteam, are delivering a postmortem of development on their IGF award-nominated game The Talos Principle at GDC 2015. ...

How we created the video for our successful Kickstarter

January 30, 2015 04:59 PM

"The video was the thing that we feel was most important to the success of our campaign and is also the place that we see otherwise great games trip and fall down on." ...

Building the booth: A PAX South indie exhibitor postmortem

January 30, 2015 04:18 PM

"With this post I want to do a postmortem and share some behind the scenes footage on what's it like trying to wing a 'professional indie setup' during a major games convention." ...

Lessons learned from localizing our game in 10 languages

January 30, 2015 04:05 PM

"It was so painful and tedious and boring which obviously led to mistakes and led to more time spent in QA. We could have made it more interesting, quicker and less of manual work if we knew some tricks." ...

Kickstarter in 2014: The breakdown

January 30, 2015 03:12 PM

A detailed analysis of the Kickstarter platform in 2014 -- and a particular look at how game projects did. Full numbers inside. ...

Get an inside look at The Long Dark and This War Of Mine at GDC

January 30, 2015 03:10 PM

GDC officials highlight great GDC 2015 talks from the developers of This War Of Mine and The Long Dark about the art of making great indie games. ...

Code Corner

Zero-byte BVH

by admin at January 30, 2015 02:28 PM

I wrote this last year. Enjoy. http://www.codercorner.com/ZeroByteBVH.pdf

Gamasutra Feature Articles

Cook, Serve, Delicious, the final chapter: Sales, crunch, and success

January 30, 2015 01:34 PM

David Galindo returns after a year with the sixth installment of dev-log for Cook, Serve, Delicious, including Humble Bundle and Steam sales numbers, and how his free updates were created (and performed). ...

Twitch heats up: In 2014, it had 100 million unique viewers per month

January 29, 2015 11:16 PM

Developers, take note: The premier video game streaming site has hit some impressive milestones, and it's more than likely that your audience is waiting for you there. ...



Nintendo sells 5.7 million Amiibos and plenty of games, but it's no relief

January 29, 2015 10:56 PM

While Nintendo's stock price has slid following its latest financial results, reports are pouring in about how well its games and new Amiibo toys are doing. ...

Get a job: Hangar 13 seeks a Graphics and Rendering Engineer

January 29, 2015 10:44 PM

2K Games' newest studio, the Novato-based Hangar 13, is hiring a seasoned graphics programmer to work on researching, developing and debugging graphics tech in its Novato, CA offices. ...

Stalled Ouya just landed $10 million investment, report claims

January 29, 2015 09:49 PM

The Wall Street Journal reports that anonymous sources claim Alibaba is considering integrating the Ouya platform (and its catalog of games) into its own line of set-top boxes. ...

How Crossy Road made $1 million from video ads

January 29, 2015 09:31 PM

After topping a million downloads and $1M in ad revenue in its first month of release, Crossy Road co-creator Andrew Sum breaks down the award-winning mobile game's unobtrusive monetization design. ...

Road to the IGF: Justin Smith's Desert Golfing

January 29, 2015 08:38 PM

Continuing our Road to the IGF series of interviews with nominees, we speak to Desert Golfing creator Justin Smith about his Nuovo Award-nominated surprise hit, an infinite and improbable existential cruise. ...

Don't Miss: Self-promotion for game developers

January 29, 2015 07:33 PM

Industry vet Raph Koster (Ultima Online) explains in this post how self-promotion is neither dishonest nor tacky -- and explains how you should step up and take credit for your work. ...

Sleep well, come prepared, collaborate: Lessons from the Global Game Jam

January 29, 2015 07:18 PM

"This year I honestly felt like I didn't learn much about the direct process of making a game, but I learned a lot more relating to everything else that plays a supporting part in game development." ...

Nintendo turns YouTubers into partners via new affiliate program

January 29, 2015 06:52 PM

The company launched a beta version of the Nintendo Creators Program this week, and it affords developers some interesting insight into how one of the biggest names in games views YouTubers. ...

Creating a user-generated chain story: A Global Game Jam postmortem

January 29, 2015 06:18 PM

Lessons learned when one developer worked through his fears to create Escalate! -- a collaborative storytelling game where the next person to pick up the game branches its story and adds their own touch. ...

UX insights: How King is shaping the future of match-3

January 29, 2015 05:28 PM

A user experience designer takes a close look at how King is shaping the future of match-3 UI/UX, from gameplay to purchase options -- with its soft-launched Pepper Panic Saga. ...

Successful game makers share indie biz tips at GDC 2015

January 29, 2015 03:12 PM

Speakers from Double Fine, Ouya, Devolver Digital, TinyBuild, Finji and more will be sharing their lessons learned as successful indie game makers during GDC 2015. ...

Trailer trouble: Dealing with pixel-art backlash

January 29, 2015 12:00 PM

Two days ago, we released our first "big" trailer. It got a lot of views, but a very vocal minority downvoted it, and got extremely mad about our use of pixel art. Here are some of our learnings from the experience. ...

c0de517e Rendering et alter

Notes on G-Buffer normal encodings

by DEADC0DE (noreply@blogger.com) at January 29, 2015 10:03 AM

Tonight on Masterchef, we'll try to cook a G-Buffer encoding that is:
  • Fast when implemented in a shader
  • As compact as possible
  • Makes sense under linear interpolation (hardware "blendable", for pixel-shader based decals)
    • So we assume no custom pixel-sync blending. On fixed hardware like a console GPU it would probably be possible to sort decals so they rarely overlap in screen-space, and to add waits so that read-write of the same buffer never generates data races, but it's not easy.
  • As stable as possible, and secondarily as precise as possible.


Normal encodings are a well studied topic, both in computer science and due to their relation with sphere unwrapping and cartographic projections of the globe. Each of the aforementioned objectives, taken singularly, is quite simple to achieve.

Nothing is faster than keeping normals in their natural representation, a three-component vector, and this representation is also the one that makes the most sense under linear interpolation (renormalizing afterwards, of course).
Unfortunately, storing three components wastes a lot of bits, as most bit combinations yield something that is not a unit vector, so most encodings go unused. In fact we're using just the thin shell of a sphere of valid encodings inside the cube of minus-one-to-one components (in 32-bit floating point we get about 51 bits' worth of normals out of the 3x32=96 bits of storage).

If we wanted to go as compact as possible, Crytek's "best fit" normals are one of the best representations, together with the octahedral projection (which has a faster encoding and almost the same decoding cost).
Best fit normals aren't the absolute optimum, as they choose the closest representation among the directions contained in the encoding cube space, so they are still constrained to a fixed set of directions, but they are quite good and easy to implement in a shader.
Unfortunately these schemes aren't "blendable" at all. Octahedral normals have discontinuities on half of the normal hemisphere, which won't allow blending even if we considered the encoding square to wrap around in a toroidal topology. Best fit normals are all of different lengths, even for very close directions (that's the key to not wasting encode space), so there is really no continuity at all.
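For reference, the octahedral projection mentioned above fits in a few lines. This is an illustrative Python/NumPy sketch (a real shader version would be branchless), not production code:

```python
import numpy as np

def _sign(v):
    # Sign that maps zero to +1, as required by the octahedral fold.
    return np.where(v >= 0.0, 1.0, -1.0)

def oct_encode(n):
    """Map a unit normal to [-1,1]^2 via the octahedral projection."""
    n = n / np.sum(np.abs(n))                      # project onto the octahedron |x|+|y|+|z|=1
    x, y, z = n
    if z < 0.0:                                    # fold the lower pyramid over the upper one
        x, y = (1.0 - abs(y)) * _sign(x), (1.0 - abs(x)) * _sign(y)
    return np.array([x, y])

def oct_decode(e):
    """Invert the projection and renormalize."""
    x, y = e
    z = 1.0 - abs(x) - abs(y)
    if z < 0.0:
        x, y = (1.0 - abs(y)) * _sign(x), (1.0 - abs(x)) * _sign(y)
    v = np.array([x, y, z])
    return v / np.linalg.norm(v)
```

Note the fold at z = 0: two nearby backward-facing normals can land near opposite edges of the square, which is exactly the kind of discontinuity that breaks hardware blending.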

Finally, stability. That usually comes from world-space normal encodings, because most games have a varying view more often than they have moving objects in front of a static view.
World-space two-component encodings can't be hardware "blendable" though, as they will always have a discontinuity somewhere on the normal sphere in order to unwrap it.
Object-space and tangent-space encodings are hard because we'd have to store these spaces in the g-buffer, which itself takes encode space.

View-space can allow blending of two-component encodings by "hiding" the discontinuity in the back-faces of objects so we don't "see" it. In practice, though, with fewer than 12 bits per component you end up seeing wobbling on very smooth specular objects: not only do normals "snap" to their next representable position as we strafe the camera, but the precise view-to-world transform we apply during decoding also "fights" with the quantized normal representation. As we don't have 12-bit frame-buffer formats, we'd need two 16-bit components, which is not very compact.

So you get quite a puzzle to solve. In the following I'll sketch two possible recipes to try to alleviate these problems.

- Improving view-space projection.

We don't have a twelve-bit format. But we do have a 10-10-10-2 format and an 11-11-10 one. The eleven-bit floats don't help us much because of their uneven precision distribution (and with no sign bit we can't even center the precision on the middle of the encoding space), but maybe we can devise some tricks to make ten bits "look like" twelve.

In screen-space, if we were ok never to have perfectly smooth mirrors, we could add some noise to gain one bit (dithering). Another idea could be to "blur" normals while decoding them, looking at their neighbours, but it's slow and it would require some logic to avoid smoothing discontinuities and normal map details, so I didn't even start with that.
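The dithering idea can be illustrated with a toy Python sketch (`quantize_dithered` is a hypothetical helper, not code from the post): adding a per-pixel offset in [0,1) before truncation makes the quantized values correct on average, trading banding for noise:

```python
import numpy as np

def quantize_dithered(value, bits, dither):
    """Quantize value in [0,1] to `bits` bits, adding a dither offset in [0,1)
    before truncation; averaged over pixels this recovers roughly one extra
    bit of apparent precision at the cost of noise."""
    levels = (1 << bits) - 1
    return np.floor(value * levels + dither) / levels

# Averaged over many dither offsets, the quantized value converges on the input:
offsets = np.linspace(0.0, 1.0, 1000, endpoint=False)
avg = np.mean([quantize_dithered(0.3, 10, d) for d in offsets])
```

With a fixed dither of 0.5 this degenerates to plain rounding; the gain comes from varying the offset per pixel (e.g. with a Bayer pattern or blue noise).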

A better idea is to look at all the projections and see how they do distribute their precision. Most studies on normal encodings and sphere unwrapping aim to optimize precision over the entire sphere, but here we really don't want that. In fact, we already know we're using view-space exactly to "hide" some of the sphere space, where we'll place the projection discontinuity.

It's common knowledge that we need more than a hemisphere's worth of normals for view-space encodings, but how much exactly, and why?
Some of it is due to normal mapping, which can push normals to face backwards, but that's at least questionable, as those normal-map details would have been hidden by occlusion had they been actual geometric displacement.
Most of the reason we need more than a hemisphere is the perspective projection we apply after the view-space transform, which makes the view vector non-constant in screen-space. Around the sides of objects at the edges of the screen, we can see normals that point backwards in view-space.

To fight this we can build a view matrix per pixel, using the view vector as the z-axis of the space, and then encode only a hemisphere (or so) of normals around it. This works, and it really helps eliminate the "wobble" caused by the fighting precisions of the encode and the view-space matrix, but it's not particularly fast.
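A sketch of that idea in Python/NumPy (illustrative only; `basis_from_view` is my name, and a real shader would use a cheaper branchless basis construction):

```python
import numpy as np

def basis_from_view(view_dir):
    """Build an orthonormal frame whose z-axis is the per-pixel view vector.
    Normals re-expressed in this frame cluster around +z, so a hemispherical
    encoding around that axis suffices."""
    z = view_dir / np.linalg.norm(view_dir)
    # Arbitrary up vector, switched when nearly parallel to z.
    up = np.array([0.0, 1.0, 0.0]) if abs(z[1]) < 0.99 else np.array([1.0, 0.0, 0.0])
    x = np.cross(up, z)
    x /= np.linalg.norm(x)
    y = np.cross(z, x)
    return np.stack([x, y, z])  # rows are the frame's axes

# Re-expressing a view-space normal in the per-pixel frame is a 3x3 multiply:
# n_local = basis_from_view(view_dir) @ n_view
```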

- Encode-space tricks.

Working with projections is fun, and I won't stop you from spending days thinking about them and how to make spherical projections work only on parts of the sphere or how to extend hemispherical projections to get a bit more "gutter" space, how to spend more bits on the front-faces and so on. Likely you'll re-derive Lambert's azimuthal equal-area projection a couple of times in the process, by mistake.

What I've found after all these experiments, though, is that you can make your life easier by looking instead at the encode space (the unit square you're projecting the normals onto), starting from a simple projection (equal-area is a good choice):
  1. You can simply clip away some normals by "zooming" in the region of the encode space you'll need. 
  2. You can approximate the per-pixel view-space by shifting the encode space so per-pixel the view-vector is its center.
    • Furthermore, as your view vector doesn't vary that much, and as your projection doesn't distort these vectors that much, you can approximate the shift by a linear transform of the screen-space 2d coordinate...
  3. Most projections map normals to a disk, not the entire square (the only two I've found that don't suffer from that are the hemispherical octahedral and the polar space transform/cylindrical projection). You can then map that disk to a square to gain a bit of precision (there are many ways to do so).
  4. You can take any projection and change its precision distribution if needed by distorting the encode space! Again, many ways to do so, I'd recommend using a piecewise quadratic if you go that route as you'll need to invert whatever function you apply. You can either distort the square by applying a function to each axis, or do the same shifting normals in the disc (easy with the polar transform).
    • Note that we can do this not only for normal projections, but also for example to improve the sampling of directions in a cubemap, e.g. for reflections...
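As a concrete starting point for the steps above, the Lambert azimuthal equal-area projection is tiny. Here is an illustrative Python version (function names are mine) mapping the sphere, minus the hidden back pole, to a disk of radius 2:

```python
import numpy as np

def lambert_encode(n):
    """Equal-area map of a unit normal to a disk of radius 2, +z at the origin.
    Singular only at z = -1, which view-space encodings hide anyway."""
    x, y, z = n
    f = np.sqrt(2.0 / (1.0 + z))
    return np.array([x * f, y * f])

def lambert_decode(e):
    r2 = float(e @ e)                  # by construction r^2 = 2 * (1 - z)
    f = np.sqrt(1.0 - r2 / 4.0)
    return np.array([e[0] * f, e[1] * f, 1.0 - r2 / 2.0])
```

Zooming, recentering, and disk-to-square remapping (steps 1-4) then become plain 2D transforms applied to the encoded coordinates before quantization.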
(Images: Lambert equal-area encoding, and the same encode with screen-space based "recentering". Note how the encoding on the flat plane isn't constant anymore.)

- Encoding using multiple bases.

This recipe comes from chef Alex Fry, from Frostbite's kitchen. It was hinted at by Sebastien Lagarde in his Siggraph presentation but never detailed. This is how I understood it, but really Alex should (and told me he will) provide a detailed explanation (which I'll link here eventually).

When I saw the issues stemming from view-space encodings, one of my thoughts was to go world-space. But then you think for a second and realize you can't, because of the inevitable discontinuities.
For a moment I considered a projection that double-covers rotations... But then you'd need to store a bit identifying which of the projections you're using (somewhat like dual-paraboloid mapping, for example), and that won't be "blendable" either.
We could do some magic if we had wrapping in the blending, but we don't, and the blending units aren't going to change anytime soon, I believe. So I tossed the idea away.

And I was wrong. Alex's intuition is that you can indeed use multiple projections, and you'll need to store extra bits to encode which projection you used... But these bits can be read-only during decal blending!
We can just read which projection was used, project the decal normals into the same space, and live happily ever after. The key, of course, is to always choose a space that hides your projection discontinuities.


(Images: tangent spaces selected with 2, 3, and 4 bits.)
There are many ways to implement this, but to me one very reasonable choice is to consider the geometric normal when laying down the g-buffer, and to choose, from a small set of transforms, the one whose main axis (the axis we'll orient our normal hemisphere towards during encoding) is closest to the surface normal.

That choice can then be encoded in a few bits that we can stuff anywhere. The two "spare" bits of the 10-10-10-2 format would work, but we can also "steal" some bits from any part of the g-buffer, even maintaining the ability to blend that component: use its most significant bits, read them in (as we need to for the normal encoding anyway), and output the component's remaining bits (the ones that need to blend) while keeping the MSBs the same across all decals.

I'll probably share some code later, but a good choice would be to use the bases whose main axis points at a vertex of the signed unit cube (i.e. with coordinates spanning minus one to plus one), for which the remaining two tangent axes are easy to derive (no need to re-orthonormalize against an up vector, as is usually done).
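A sketch of the basis selection in Python/NumPy: an illustrative 3-bit variant using the eight cube corners, with names of my own choosing (not Frostbite's code). For simplicity it derives the tangents from a fixed up vector, though as noted above the corner axes admit a cheaper direct derivation:

```python
import numpy as np

# The eight signed cube corners, normalized: the candidate main axes
# for a 3-bit space-selection scheme.
CORNERS = np.array([[sx, sy, sz] for sx in (-1.0, 1.0)
                                 for sy in (-1.0, 1.0)
                                 for sz in (-1.0, 1.0)]) / np.sqrt(3.0)

def select_basis(geometric_normal):
    """Pick the corner axis closest to the geometric normal; the index is
    what gets stored (read, never blended) in the g-buffer's spare bits."""
    idx = int(np.argmax(CORNERS @ geometric_normal))
    z = CORNERS[idx]
    # For corner axes, |z.z| = 1/sqrt(3), so a fixed up vector never degenerates.
    up = np.array([0.0, 0.0, 1.0])
    x = np.cross(up, z)
    x /= np.linalg.norm(x)
    y = np.cross(z, x)
    return idx, np.stack([x, y, z])
```

Decals then decode `idx` from the g-buffer and encode their own normals in the same frame, so both sides blend within one continuous hemispherical projection.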

Finally, note how this encoding gives us a representation that plays nicely with hardware blending, yet is also more precise than a plain two-component 10-10 projection if done right. Each encoding space in fact takes care of a subset of the normals on the sphere, namely all the normals closest to its main axis.
Thus the subsequent projection to a two-component representation has to handle only a subset of all possible normals, and the more spaces (and bits) we have, the fewer normals each space has to encode.
So, if we engineer the projection to encode only the normals it has to, we effectively gain precision as we add bits to the space selection.

Conversely, beware of encoding more normals than needed, as that can create creases as you move from one space to another on the sphere when applying normal maps: some points will have more "slack" space to encode normals and some less, and you'll have to make sure you never end up pushing normals outside the tangent-space hemisphere at any surface point.

- Conclusions & Further references:

Once again we have a fairly fundamental problem in computer graphics that is still a potential source of much research, and that still isn't really solved.
If we look carefully at all our basic primitives (normals, colours, geometry, textures and so on), there is still a lot we don't know or do badly, often without realizing it.
The two encoding strategies I wrote about are certainly a step forward, but at 10 bits per component you will still see quantization patterns on very smooth materials (sharp highlights and mirror-like reflections are the real test case). And none of the mappings is optimal; there are still opportunities to stretch and tweak the encodings to use their bits better.
Fun times ahead.


(Image: a good test case, with highly specular materials, flat surfaces, and slowly rotating objects/view.)

Gamasutra Feature Articles

Which of your new users are about to churn?

January 29, 2015 09:01 AM

"The most significant churn occurs right after install regardless if it's on a mobile device or on the web. The first hours, not to say minutes, of game play, are critical." ...

Horror in the Making: How Red Barrels outlasted Outlast

January 29, 2015 09:00 AM

Development of Red Barrels' Outlast was as intense -- and at times as horrifying -- as the game itself. Studio co-founder Philippe Morin writes about the ups, downs, and ultimate success of the survival horror game. ...

Game From Scratch

Godot Engine Tutorial Part 3 -- Program Lifecycle and Input Handling

by Mike@gamefromscratch.com at January 29, 2015 03:10 AM

 

This tutorial is available in video form here or embedded below. 

 

Now might be a good time to pause and look at the life cycle of a typical program, as this can be a bit confusing with Godot yet is something you really need to understand.  Every non-trivial game has a game loop somewhere.  This is the code that runs after a program starts and is basically the heart of your application.  In pseudo code, it might look something like this:

 

main() {
   
   setupApplication()
   scene = createScene()
   
   while(!quit){
      get_input()
      update_physics()

      scene.updateAllChildren()
      scene.render()
   }
}

At its heart it's a glorified loop that runs over and over, checking for input, updating the scene and rendering the results until told to stop.

 

Godot is of course no exception, although by default this behavior is hidden from you, as is the norm with game engines.  Instead, the object that owns your scene is a SceneTree, which itself inherits from MainLoop, which provides the above functionality.  A default one is provided for you, but if you wish you can implement your own; more on that below.

 

What you should realize, however, is that this SceneTree is the beating heart of your application.  Every frame it calls into its active scene, passing in all the input that has occurred, as well as updating nodes that request updating.  We will look at this process now.  One important thing to be aware of: nodes can access the SceneTree using the method get_tree().

 

Updating a Node every frame

 

Ok, so that’s the basics of how program execution flows, now let’s take a look at a more practical example.  Let’s say for example we have a Sprite that we want to update every frame.  How do we tell our MainLoop that we want or Node to be updated?  Fortunately it’s quite simple. 

 

Create a Sprite node, add a graphic, position it on the screen then add a new script to it.  All of this was covered in the previous tutorial if you are unsure how to proceed.

 

Now that we have a Script attached, first we need to tell Godot we want to receive updates.  That is, on every iteration of the main loop, we want our script to be called.  This is a two-part process, pun not intended… much.  First, in your _ready function, you tell Godot you want to receive updates by calling set_process(true).  Then you override the virtual function _process().

 

Let’s take a look at a simple sprite that moves right until it hits the edge of the screen, at which point it wraps around.

extends Sprite


func _ready():
   self.set_process(true)
   
func _process(delta):
   var cur_pos = self.get_pos()
   cur_pos.x += 100 * delta
   
   # wrap around screen
   if(cur_pos.x > self.get_viewport_rect().size.width + self.get_item_rect().size.width/2):
      cur_pos.x = -self.get_item_rect().size.width/2
   self.set_pos(cur_pos)

set_process tells Godot to call this node's _process() function.  The value passed in, delta, is the elapsed time since the last time _process was called.  As you can see in the above example, this value can be used to animate at a constant rate: the example updates the X value by 100 pixels per second.  Your end result should look something like this:

 

(Embedded video of the sprite moving across the screen.)

 

So, in a nutshell, if you want to handle updates in your Node-derived object, you simply call set_process(true) and provide a _process(float) override.

 

Handling Input by Polling

 

That moves us on to handling input.  You will notice that Input and Process handling are very similar.  There are a couple ways you can handle input in Godot.  Let’s start with the easiest, polling.

 

You can poll input at any time using the global object Input, like so:

func _process(delta):
   if(Input.is_key_pressed(KEY_ESCAPE)):
      if(Input.is_key_pressed(KEY_SHIFT)):
         get_tree().quit()

 

This first checks if the ESCAPE key is pressed, then if the SHIFT key is also pressed.  If both are, we tell the SceneTree to exit the application.  As I said earlier, a node can access its SceneTree using get_tree().

 

In addition to polling for keyboard, there are also methods is_joy_button_pressed(), is_mouse_button_pressed() and is_action_pressed() which will make more sense in the near future.  You can also poll for status.  For example, to check the mouse cursor or touch location you could:

 

func _process(delta):
   if(Input.is_mouse_button_pressed(BUTTON_LEFT)):
      print(str("Mouse at location:",Input.get_mouse_pos(), " moving at speed: ", Input.get_mouse_speed()));

There are other inputs you can poll as well, mostly mobile based, but they all use a very similar interface. I will cover mobile specific controls at a later point in this series.

 

Handling Input as Event Driven

 

You can also have Godot hand your application all the Input events as they occur and choose what to process.  Just like handling updates, you have to register to receive input events, like so:

func _ready():
   set_process_input(true)

 

Then you override the function _input, which takes an InputEvent as a parameter.

func _input(event):
   # if user left clicks
   if(event.type == InputEvent.MOUSE_BUTTON):
      if(event.button_index == 1):
         self.set_pos(Vector2(event.x,event.y)) 
         
   # on keyboard cursor key
   if(event.type == InputEvent.KEY):
      var curPos = self.get_pos()
      
      if(event.scancode == KEY_RIGHT):
         curPos.x+= 10
         self.set_pos(curPos)

      if(event.scancode == KEY_LEFT):
         curPos.x-= 10
         self.set_pos(curPos)

 

The above example handles a couple of different scenarios.  First, if the user clicks the left button, we set the position to the mouse's current x and y location, as passed in by the InputEvent class.  Notice that in this case I tested for the button by index instead of the define BUTTON_LEFT like earlier.  There should be no functional difference, although this would allow you to test for buttons for which a mapping isn't defined, such as on one of those insane 12-button mice.  Next we check whether the event is a KEY event, and if it is, which key.  In the event of the right or left arrow, we update our position accordingly.

 

Sometimes, however, when you handle an event you want it done and gone.  By default, all events continue to be broadcast to all event receivers.  When you don't want this behavior, it's fairly simple to tell Godot that an event is handled.  From the above example, let's swallow the event in the case of it being an InputEvent.KEY.  This means only this class will have access to keyboard events (well… this class and GUI controls, which actually get their crack at the events earlier on).

   # on keyboard cursor key
   if(event.type == InputEvent.KEY):
      self.get_tree().set_input_as_handled()
      var curPos = self.get_pos()

 

Calling set_input_as_handled() will cause the InputEvent to propagate no further.

 

Finally it’s possible you want to do a catch all.  Perhaps you want to log all of the unhandled input events that occurred.  This can also be done and you also have to register for this behavior, like so:

func _ready():
   set_process_unhandled_input(true)

Then you simply override the corresponding function:


func _unhandled_input(event):
   print(str("An event was unhandled ",event))

In this case we simply log the event to the console.  Be warned, though: there will be a lot of them.  Just no keyboard events, since we are now eating those!

 

Input Maps

 

Quite often you want several inputs to perform the same action.  For example, you might want pushing right on the controller d-pad to do the same thing as pressing the right arrow key.  Or perhaps you want to let the user define their own controls?  In both of these cases an input alias system is incredibly useful… and thankfully Godot has one built in: InputMaps.

 

You may have noticed the InputMap tab when we were in Project Settings earlier; if not, open up Project Settings now.

 

Here you can see a number of mappings that have already been defined for UI actions.  Let's go ahead and create a mapping of our own, MOVE_RIGHT.

At the top, in Action, enter MOVE_RIGHT, then click Add.  A new entry will be added to the bottom of the page.

Click the + icon and add a new mapping of type Key.  You will then be prompted to press a key.

Repeat this process and instead select another device; I'm also going to map the right mouse button.

Now click the Save button and close the dialog.

Now in code you can easily check activity using the input map, like so:

func _process(delta):
   if(Input.is_action_pressed("MOVE_RIGHT")):
      var cur_pos = self.get_pos()
      cur_pos.x += 1
      self.set_pos(cur_pos)

This code will run if either condition is true: the right arrow key is pressed, or the right mouse button is.  The above example is polled, but it's just as easy to use an InputMap with event-driven code, like so:

func _input(event):
   
   if(event.is_action("MOVE_RIGHT")):
      self.set_pos(Vector2(0,0))

 

One warning here, however: actions are more like states (as in on or off) than they are events, so it probably makes a great deal more sense to deal with them via the former approach (polling) than the latter (event-driven).

 

A Peek Behind the Curtain

 

If you are like me, you probably aren't content with not knowing exactly what is going on behind the scenes.  Black boxes just aren't my thing, and this is one of the great things about Godot being open source: there are no black boxes!  So if you want to understand exactly how program flow works, it helps to jump into the source code.

 

THIS IS COMPLETELY OPTIONAL!

I figured I would bold that.  The following information is just for people who want to understand a bit more about what is happening behind the scenes.  We are going to hunt down the actual main loop in the source code; it lives in main/main.cpp.  Specifically, the method iteration() is effectively the main loop:

bool Main::iteration() {

   uint64_t ticks=OS::get_singleton()->get_ticks_usec();
   uint64_t ticks_elapsed=ticks-last_ticks;

   frame+=ticks_elapsed;

   last_ticks=ticks;
   double step=(double)ticks_elapsed / 1000000.0;

   float frame_slice=1.0/OS::get_singleton()->get_iterations_per_second();

   if (step>frame_slice*8)
      step=frame_slice*8;

   time_accum+=step;

   float time_scale = OS::get_singleton()->get_time_scale();

   bool exit=false;


   int iters = 0;

   while(time_accum>frame_slice) {

      uint64_t fixed_begin = OS::get_singleton()->get_ticks_usec();

      PhysicsServer::get_singleton()->sync();
      PhysicsServer::get_singleton()->flush_queries();

      Physics2DServer::get_singleton()->sync();
      Physics2DServer::get_singleton()->flush_queries();

      if (OS::get_singleton()->get_main_loop()->iteration( frame_slice*time_scale )) {
         exit=true;
         break;
      }

      message_queue->flush();

      PhysicsServer::get_singleton()->step(frame_slice*time_scale);
      Physics2DServer::get_singleton()->step(frame_slice*time_scale);

      time_accum-=frame_slice;
      message_queue->flush();
      //if (AudioServer::get_singleton())
      // AudioServer::get_singleton()->update();

      fixed_process_max=MAX(OS::get_singleton()->get_ticks_usec()-fixed_begin,fixed_process_max);
      iters++;
   }

   uint64_t idle_begin = OS::get_singleton()->get_ticks_usec();

   OS::get_singleton()->get_main_loop()->idle( step*time_scale );
   message_queue->flush();

   if (SpatialSoundServer::get_singleton())
      SpatialSoundServer::get_singleton()->update( step*time_scale );
   if (SpatialSound2DServer::get_singleton())
      SpatialSound2DServer::get_singleton()->update( step*time_scale );


   if (OS::get_singleton()->can_draw()) {

      if ((!force_redraw_requested) && OS::get_singleton()->is_in_low_processor_usage_mode()) {
         if (VisualServer::get_singleton()->has_changed()) {
            VisualServer::get_singleton()->draw(); // flush visual commands
            OS::get_singleton()->frames_drawn++;
         }
      } else {
         VisualServer::get_singleton()->draw(); // flush visual commands
         OS::get_singleton()->frames_drawn++;
         force_redraw_requested = false;
      }
   } else {
      VisualServer::get_singleton()->flush(); // flush visual commands
   }

   if (AudioServer::get_singleton())
      AudioServer::get_singleton()->update();

   for(int i=0;i<ScriptServer::get_language_count();i++) {
      ScriptServer::get_language(i)->frame();
   }

   idle_process_max=MAX(OS::get_singleton()->get_ticks_usec()-idle_begin,idle_process_max);

   if (script_debugger)
      script_debugger->idle_poll();


   // x11_delay_usec(10000);
   frames++;

   if (frame>1000000) {

      if (GLOBAL_DEF("debug/print_fps", OS::get_singleton()->is_stdout_verbose())) {
         print_line("FPS: "+itos(frames));
      };

      OS::get_singleton()->_fps=frames;
      performance->set_process_time(idle_process_max/1000000.0);
      performance->set_fixed_process_time(fixed_process_max/1000000.0);
      idle_process_max=0;
      fixed_process_max=0;

      if (GLOBAL_DEF("debug/print_metrics", false)) {

         //PerformanceMetrics::print();
      };

      frame%=1000000;
      frames=0;
   }

   if (OS::get_singleton()->is_in_low_processor_usage_mode() || !OS::get_singleton()->can_draw())
      OS::get_singleton()->delay_usec(25000); //apply some delay to force idle time
   else {
      uint32_t frame_delay = OS::get_singleton()->get_frame_delay();
      if (frame_delay)
         OS::get_singleton()->delay_usec( OS::get_singleton()->get_frame_delay()*1000 );
   }

   int taret_fps = OS::get_singleton()->get_target_fps();
   if (taret_fps>0) {
      uint64_t time_step = 1000000L/taret_fps;
      target_ticks += time_step;
      uint64_t current_ticks = OS::get_singleton()->get_ticks_usec();
      if (current_ticks<target_ticks) OS::get_singleton()->delay_usec(target_ticks-current_ticks);
      current_ticks = OS::get_singleton()->get_ticks_usec();
      target_ticks = MIN(MAX(target_ticks,current_ticks-time_step),current_ticks+time_step);
   }

   return exit;
}
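Before dissecting that function, it may help to see its skeleton with all the engine services stripped away. The following Python sketch is my own distillation (not Godot code, and the numbers are illustrative): physics advances in fixed `frame_slice` steps drained from an accumulator, while idle/rendering runs once per loop with the variable step, and runaway frames are clamped just as the `step>frame_slice*8` check does above:

```python
# Distilled fixed-timestep pattern from Main::iteration(): a time
# accumulator feeds fixed-size physics steps, while idle runs once per
# wall-clock frame regardless of how many physics steps were taken.

def run_frames(frame_times, iterations_per_second=60):
    frame_slice = 1.0 / iterations_per_second
    time_accum = 0.0
    physics_steps = 0
    idle_calls = 0
    for step in frame_times:               # seconds elapsed since last frame
        step = min(step, frame_slice * 8)  # clamp runaway frames, as Godot does
        time_accum += step
        while time_accum > frame_slice:    # fixed-rate "fixed process" / physics
            physics_steps += 1
            time_accum -= frame_slice
        idle_calls += 1                    # variable-rate idle / render
    return physics_steps, idle_calls

# Three 20ms frames give one physics step each; a single long 100ms frame
# is caught up with six steps in a row.
print(run_frames([0.02, 0.02, 0.02, 0.1]))  # (9, 4)
```

This is why a long hitch produces a burst of physics steps on the next frame: the accumulator has to drain back below one slice before rendering resumes.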

 

If you take a good look at that code, you'll notice that beyond the complexity it's actually remarkably close to the pseudocode I started this post with. As I said, most game loops start looking pretty same-y over time. Looking through the code, you will also notice a number of calls like this:

 

OS::get_singleton()->get_main_loop()->iteration( frame_slice*time_scale );

These are callbacks into the MainLoop we mentioned earlier. By default, Godot implements one in C++ that you can find in core/os/main_loop.cpp:

 

#include "main_loop.h"
#include "script_language.h"

void MainLoop::_bind_methods() {

   ObjectTypeDB::bind_method("input_event",&MainLoop::input_event);

   BIND_CONSTANT(NOTIFICATION_WM_FOCUS_IN);
   BIND_CONSTANT(NOTIFICATION_WM_FOCUS_OUT);
   BIND_CONSTANT(NOTIFICATION_WM_QUIT_REQUEST);
   BIND_CONSTANT(NOTIFICATION_WM_UNFOCUS_REQUEST);
   BIND_CONSTANT(NOTIFICATION_OS_MEMORY_WARNING);

};

void MainLoop::set_init_script(const Ref<Script>& p_init_script) {

   init_script=p_init_script;
}

MainLoop::MainLoop() {
}


MainLoop::~MainLoop()
{
}



void MainLoop::input_text( const String& p_text ) {


}

void MainLoop::input_event( const InputEvent& p_event ) {

   if (get_script_instance())
      get_script_instance()->call("input_event",p_event);

}

void MainLoop::init() {

   if (init_script.is_valid())
      set_script(init_script.get_ref_ptr());

   if (get_script_instance())
      get_script_instance()->call("init");

}
bool MainLoop::iteration(float p_time) {

   if (get_script_instance())
      return get_script_instance()->call("iteration",p_time);

   return false;

}
bool MainLoop::idle(float p_time) {

   if (get_script_instance())
      return get_script_instance()->call("idle",p_time);

   return false;
}
void MainLoop::finish() {

   if (get_script_instance()) {
      get_script_instance()->call("finish");
      set_script(RefPtr()); //clear script
   }


}

 

The default main loop is, in turn, mostly a set of callbacks into the active script.  You can easily replace this MainLoop implementation with your own, either in GDScript or C++.  Simply pass your class name to the main_loop_type value in Project Settings:

(Screenshot: the main_loop_type setting in Project Settings.)

 

Granted, very few people are actually going to need to do this; it's mostly for those who want to live entirely in C++ land.  I do, however, think it's extremely valuable to understand what is going on behind the scenes in a game engine!
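To make the MainLoop contract concrete, here is a Python sketch of the callback sequence the engine drives, as seen in the C++ above: init() once, then iteration() (fixed step) and idle() (variable step) repeatedly until one of them returns true, then finish(). Both `drive()` and `MyMainLoop` are hypothetical stand-ins of my own, not Godot API:

```python
# Sketch of the MainLoop lifecycle: the engine calls init(), then loops
# calling iteration() and idle() until one returns True, then finish().

class MyMainLoop:
    def __init__(self):
        self.fixed_ticks = 0
        self.idle_ticks = 0
        self.finished = False

    def init(self):
        pass

    def iteration(self, delta):   # fixed-step callback; True means "quit"
        self.fixed_ticks += 1
        return self.fixed_ticks >= 5

    def idle(self, delta):        # variable-step callback
        self.idle_ticks += 1
        return False

    def finish(self):
        self.finished = True

def drive(loop, frame_slice=1/60):   # stand-in for the engine's Main::iteration()
    loop.init()
    while True:
        if loop.iteration(frame_slice):
            break
        loop.idle(frame_slice)
    loop.finish()

ml = MyMainLoop()
drive(ml)
print(ml.fixed_ticks, ml.finished)  # 5 True
```

A custom MainLoop, whether in GDScript or C++, is simply a class that fills in these four callbacks and is named in main_loop_type.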

 

The Video

 
