Planet Gamedev

Game AI for Developers

BROADCAST: Conference Live Stream (July 20th)

by Alex J. Champandard at July 05, 2015 09:00 AM


This upcoming broadcast on Monday, July 20th at 07:00 UTC will take place online within your browser using streaming audio/video:

“Join us live on July 20-22 for three days of Artificial Intelligence in Creative Industries, including presentations about character animation, agent behavior, analytics & data-science, crowds and ambient life, systemic design, procedural generation — and many more!”

To subscribe for email reminders and check the exact time in your current timezone, visit this broadcast's page on

BROADCAST: Evolve: Building PvP Artificial Intelligence for Hunters and Monsters (July 9th)

by Alex J. Champandard at July 05, 2015 09:00 AM


This upcoming broadcast on Thursday, July 9th at 19:00 UTC will take place online within your browser using streaming audio/video:

“This interview with Troy Humphreys digs into the AI behind Evolve, from the underlying systems to the gameplay behaviors. Troy will share details about the technology and insights from development.”

To subscribe for email reminders and check the exact time in your current timezone, visit this broadcast's page on

c0de517e Rendering et alter

The following provides no answers, just doubts.

by DEADC0DE ( at July 04, 2015 09:08 PM

Technical debt, software rot, programming practices, sharing and reuse etcetera. Many words have been written on software engineering, just today I was reading a blog post which triggered this one.

Software does tend to become impervious to change and harder to understand as it ages and grows in complexity; that much is universally agreed upon. In general it's also understood that malleability is one key measure of code, and that practices which improve or retain it are often sensible.

But when does code stop being an asset and start being a liability? For example, when should we invest in a rewrite? 
Most people seem to be divided into camps on these topics. In my experience I've often seen arguments, and even entire teams, run on one conviction or the other: either aggressively throwing away code to maintain quality, or never allowing rewrites so as to capitalize on investments made in debugging, optimization and so on.

Smarter people might tell you that different decisions are adequate for different situations. Not all code needs to be malleable: as we stratify, certain layers become more complex but also require less change, while the newer layers are the ones we actively iterate upon and that need more speed. 
Certainly this position has lots of merit, and I'd say it can be extended to the full production stack, including the tools and operating systems we use.

Such a position makes our evaluation more nuanced and reasonable, but it doesn't really answer many questions. What is the acceptable level of stiffness in a given codebase? Is there a measure? Whom do we ask? It might be tempting to just look at the rate of change, at where we usually put more effort, but most of these measures are exposed to a number of biases.

For example, I usually tend to use certain systems and avoid others based on what makes my life easier when solving a problem. That doesn't mean I use the best system for a given problem, that I wouldn't like to try different solutions, or that these wouldn't be better for the end product. 
Simply, as I know they would take more effort, I might decide they are not worth pursuing. An observer looking at this workflow would infer that the systems I don't use don't need much flexibility, when on the contrary I might not be using them precisely because they are too inflexible.

In time, with experience, I've started to believe that all these questions are hard for a reason: they fundamentally involve people. 
As an engineer, or rather a scientist, one grows up with the ideal of simple formulas explaining complex phenomena, but people's behaviour still seems to elude such simplifications.

Like cheap management books (are there any other kind?), you might get certain simple lists of rules that seem to make a lot of sense, but are really just arbitrary rules that happened to work for someone (in the best case, very specific tools; in the worst, just crap that sounds reasonable enough but has no basis). They gain momentum until people realize they don't really work that well, and someone else comes up with a different, but equally arbitrary, set of new rules and best practices.
They are never backed by real, scientific data.

In reality your people matter more than any rule. The practices of a given successful team don't transfer to other teams: I've often seen different teams making similar products successfully using radically different methodologies, and vice versa, teams using the same methodologies in the same company achieving radically different results.

Catering to a given team's culture is fundamental; what works for a relatively small team of seniors won't apply to a team with, for example, a much higher turnover of junior engineers. 
Failure often comes from people who grew up in given environments, with methodologies adapted to the culture of a certain team, and who, because that was successful once, try to apply the same to other contexts where it is not appropriate.

In many ways it's interesting: working with people encourages real immersion into an environment, reasoning, observing and experimenting with the specific problems and specific solutions one can find, rather than trying to apply a rulebook. 
In other ways, I still believe it's impossible to shut off that nagging feeling that we should be more scientific, that if medicine manages to work with best practices based on statistics, so can any other field. So far I've never seen big attempts at making software development a science, deployed in a production environment. 

Maybe I'm wrong and there is a universal best way of working, for everyone. Maybe certain things that are considered universal today really aren't. It wouldn't be surprising, as these kinds of paradigm shifts happen in the history of other scientific fields.

Interestingly, we often fill in questionnaires to gather subjective opinions about many things, from meetings to overall job satisfaction, but never (in my experience) about the code we write or the way we make it: time spent where, bugs found where, and so on...
I find it amusing to observe how code and computer science are used to create marvels of technological progress, incredible products and tools that improve people's lives and are scientifically designed to do so, yet the way these are made is often quite arbitrary, messy and unproductive.
It also means that, more often than not, we use and appreciate certain tools we rely on to make our products, but we don't dare think about how they really work internally, or how they were made, because if we knew, or focused on that, we would be quite horrified.

Procedural World

Export your creations

by Miguel Cepero ( at July 04, 2015 05:46 PM

We just completed a new iteration on the FBX export feature. This new version is able to bake textures along with the geometry. Check it out in this video:

The feature seems rather simple to the user; however, there are massive levels of trickery going on under the hood.

When you look at a Voxel Farm scene, a lot of what you see is computed in realtime. The texturing of individual pixels happens on the GPU, where the different attributes that make up each voxel material are evaluated on the fly. If you are exporting to a static medium, like an FBX file, you cannot have any dynamic elements computed on the fly. We had no choice but to bake a texture for each mesh fragment.

The first step is to unwrap the geometry and make it flat so it fits a square 2D surface. Here we compute UV coordinates for each triangle. The challenge is how to fit all triangles in a square while minimizing wasted space and any sort of texture distortion.

Here is an example of how a terrain chunk is unwrapped into a collection of triangles carefully packed into a square:

The image also shows that different texture channels like diffuse and normal can then be written into the final images. That is the second and last step, but there is an interesting twist here.

Since we are creating a texture for the mesh anyway, it would be a good opportunity to include features not present in the geometry at the current level of detail. For instance, consider these tree stumps that are rendered using geometry at the highest level of detail:

Each subsequent level of detail has less resolution. If we go ahead five levels, the geometric resolution won't be fine enough for these stumps to register. Wherever there was a stump, we may now get just a section of a much larger triangle.

Now, for the FBX export we are allocating unique texture space for these triangles. At this level of detail the texture resolution may still be enough for the stumps to register. So instead of evaluating the texture for the low resolution geometry, we project a higher resolution model of the same space into the low detail geometry. Here you can see the results:

Note how a single triangle can contain the projected image of a tree stump.

This process is still very CPU intensive as we need to compute higher resolution versions for the low resolution cells. This iteration was mostly about getting the feature working and available for users. We will be optimizing this in the near future.

The algorithms used here are included in the SDK and engine source code. This sort of technique is called "Detail Transfer" or "Detail Recovery". It is also the cornerstone for a much better looking LOD system, as very rich voxel/procedural content can be captured and projected on top of fairly simple geometry.


OpenVZ Server Setup Notes – Ubuntu 14.04 LAMP (w/o M)

by Mike K at July 04, 2015 04:15 AM

For a side project, I'm using a cheap server from these guys:

I've decided that since it's for development, I'd rather use Apache instead of Nginx. Nginx is much better than Apache when it comes to memory usage and performance, but Apache is a little easier to organize thanks to .htaccess files. And since Ludum Dare runs, and will continue to run, Apache for a while, I've decided to make my life working on both projects a little simpler.

For my reference, the following are my setup notes for the server.

0. Nuking the server

The old NgineX install is now gone. Replaced with a fresh Ubuntu 14.04 OpenVZ image. I believe it’s the Ubuntu 14.04 Minimal image from here:

SSH’ing in, I need to remember to get the login from the control panel. I also specifically only allowed my own IP address to SSH in to the server, using the Remote Access Policy “Only Allowed IPs”.

Now we can begin.

1. Preamble

SSH in. I am groot.

apt-get update
apt-get dist-upgrade

To be able to add additional repositories, we need:

apt-get install python-software-properties

locale-gen en_US.UTF-8
export LANG=en_US.UTF-8

These locale settings solve issues with add-apt-repository, as apparently UTF-8 hadn't been configured yet.


NOTE: When we start adding launchpad repositories, we’ll eventually get an error like this when we run “apt-get update“:

W: GPG error: trusty Release: The following signatures couldn't be verified because the public key is not available: NO_PUBKEY AM8147UI12ADUD

To solve that, grab the key ID after NO_PUBKEY and feed it to this command:

apt-key adv --keyserver --recv-keys AM8147UI12ADUD

apt-get update


2. Basic Apache and PHP Setup

The Ubuntu repository has Apache 2.4.7 and PHP 5.5. For the latest (2.4.12+ and 5.6+), we do this:

add-apt-repository ppa:ondrej/php5-5.6
apt-get update
apt-get install apache2 php5 php5-mysql

That covers the basic Apache+PHP configuration.

If you wanted to install MySQL Server, you’d do the following.

apt-get install mysql-server

I don’t need it (the host I’m using offers an external SQL server), but for reference that’s what you need to know.

3. Apache Configuration

nano /etc/apache2/apache2.conf

nano /etc/apache2/ports.conf

nano /etc/apache2/sites-enabled/000-default.conf

DocumentRoot /var/www/public   # Instead of /var/www/html
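For reference, here is a minimal sketch of what the relevant part of 000-default.conf might look like after that change. The Directory block is an assumption on my part: AllowOverride All is what actually lets Apache honor the .htaccess files mentioned earlier.

```apache
<VirtualHost *:80>
    DocumentRoot /var/www/public

    # Needed for .htaccess files to take effect (assumption, not in the original notes):
    <Directory /var/www/public>
        AllowOverride All
        Require all granted
    </Directory>
</VirtualHost>
```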

TODO: mod_CloudFlare

4. PHP PECL Packages

To use PECL packages, we need to install Pear and PHP5 Dev.

apt-get install php-pear php5-dev

pear config-set php_ini /etc/php5/apache2/php.ini

The last line will save you from manually adding things like “” to php.ini.

We can now use PECL.

4a. APCu

I'm a big fan of APCu. It lets me share data between PHP processes via RAM.

pecl install apcu-beta

I’m using a low memory server (256 MB), so we should explicitly say how much memory to give APCu.

The default is 32 MB, which should be fine for now.
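If that ever needs changing, APCu reads its shared-memory size from php.ini; a sketch (the 32M value simply restates the default mentioned above, and apc.shm_size is APCu's standard directive name):

```ini
; In /etc/php5/apache2/php.ini:
apc.shm_size = 32M
```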

5. PHP Configuration (php.ini)

nano /etc/php5/apache2/php.ini

display_errors = on
memory_limit = 128M         # make this smaller
upload_max_filesize = 2M

5b. PHP OpCache


6. Restart Apache

Now that everything is installed, restart Apache.

/etc/init.d/apache2 restart

7. Git, SSH and Source Code

apt-get install git

Now, generate an SSH key. Pass-phrase?

cat ~/.ssh/

Copy the Public Key, and paste it to your SSH Keys configuration (GitHub/Bitbucket).

Move the placeholder website out of the www folder.

mv /var/www/html /var/

git clone the source repository with an SSH URI.

8. Remote Database

Given a Web Server and a Database Server on the same local network.

Using Database Server’s CPanel:

  • Add a Database.
  • Add a User.
  • Give user full permissions to the database.
  • Add the Web Server’s internal IP to the “Remote Database Access Hosts” list

Then from the Web Server:

  • In PHP code, reference the database by the internal HostName/IP of the Database Server instead of “localhost”.

9. Automatic Updates


nano /etc/apt/apt.conf.d/50unattended-upgrades

Unattended-Upgrade::Allowed-Origins {
//      "${distro_id}:${distro_codename}-updates";

You can enable downloading of general updates, in addition to security updates, by uncommenting that line.

nano /etc/apt/apt.conf.d/10periodic

APT::Periodic::Update-Package-Lists "1";
APT::Periodic::Download-Upgradeable-Packages "1";
APT::Periodic::AutocleanInterval "7";
APT::Periodic::Unattended-Upgrade "1";

Apparently if we create this file, this is a decent daily configuration (see Details).

10. Lockdown SSH

Figure out the local IP addresses of the server, and open sshd_config.


nano /etc/ssh/sshd_config

Add a ListenAddress for your LAN IP.
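A sketch of the relevant sshd_config lines, using a hypothetical LAN address (substitute the internal IP you found above):

```
# /etc/ssh/sshd_config
Port 22
```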


Reboot, and SSH will now only allow incoming SSH connections from the local network.


Git Notes on SSH Keys

by Mike K at July 03, 2015 10:52 PM

Dealing with SSH keys is confusing. Every machine you run should have a unique SSH key.

SSH keys typically consist of 2 files:

  • id_rsa – Your Private Key. Used ONLY on your local machine.
  • – Your Public Key. Give it to others (pub for public).


The names themselves don’t matter, so feel free to rename them. It’s what the files contain that’s important.

If you ever lose or have a key compromised, generate a new one. As long as we are using them for version control, they are perfectly disposable. Don’t forget to delete old keys from your GitHub and Bitbucket accounts!

Steps with ** typically only ever need to be done once per computer.

Step 1: Generating an SSH key **

Once you’ve generated a key, it can be used for multiple services (GitHub, Bitbucket, etc).

You can check if you have any keys installed by looking in the ~/.ssh directory.

ls -al ~/.ssh

The default names are “id_rsa” and ““.

To generate a key, use ssh-keygen:

ssh-keygen -t rsa -b 4096 -C ""

I’ll be keeping the default names (~/.ssh/id_rsa and ~/.ssh/

By default, when you set a pass-phrase, you will be prompted for it every time you access the remote repository. Pass-phrases are strongly recommended, because security. However, this behaviour can be changed (see Appendices A and B).


Step 1b: Backup your Keys! (optional) **

This isn’t necessary, but now would be a good time to backup your keys.

Ideally you should have some real and proper way to backup your keys, but here’s my lazy way:

cd ~/.ssh
mkdir backup
cp id_rsa* backup

You'll always be using the originals (~/.ssh/id_rsa), but in case you accidentally overwrite them, you have a copy in the ~/.ssh/backup folder.

Step 2: Install Public Key on your services **

If you haven’t already, install xclip.

sudo apt-get install xclip

SSH Keys need to be copied exactly, so xclip handles your clipboard for you.

Copy your Public Key with this command:

xclip -sel clip < ~/.ssh/

Your clipboard will now contain your Public SSH key.

Next, go add the Public SSH Key to your accounts. Do this by pasting the clipboard in to the box provided.

For GitHub, you can find it under Settings/SSH Keys.

For details, see Step 4:

For Bitbucket, you can find it under Manage Account/SSH Keys.

For details, see Step 6:

Give the keys you add to your accounts good names, something that identifies the computer they belong to. That way it's easier to know which machine a key came from if you ever need to generate new ones.

Step 3. Change remotes from HTTPS to SSH

To login using your SSH key, you need to change the remote from an HTTPS URL to an SSH URI.

To check your remotes, run the following command to list them:

git remote -v

  • HTTPS URLs typically begin with https://
  • SSH URIs typically begin with git@ (the user), and use a colon : to separate HOST and PATH, not a slash

On GitHub, to find out your repository SSH URI, click SSH below the clone URL box.

click on SSH for your SSH clone URI


On Bitbucket, click the drop-down box beside the URI to change it.

change the clone URI to SSH


Once you've configured the SSH keys, you should always check out using SSH URIs instead of HTTPS.

git clone

Since you probably didn’t do that, here’s how we can change the remote:

git remote set-url origin

Adjust the code above accordingly if you used Bitbucket instead of GitHub.
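As a concrete sketch with hypothetical user/repo names (not a real repository), the full command looks like this:

```shell
# Hypothetical repository, for illustration only:
git remote set-url origin git@github.com:someuser/somerepo.git

# Verify the change:
git remote -v
```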


Step 4. Done…?

That’s actually it, assuming we don’t mind punching in our pass-phrase every time.

We do mind though.

Appendix A: ssh-agent (i.e. the temporary solution)

If we want to create a temporary shell that will remember the pass-phrase, use this command:

ssh-agent bash

Then add the SSH key:

ssh-add ~/.ssh/id_rsa

Again, this is only temporary. When you invoke exit, the pass-phrase will be forgotten.

Depending on the Linux configuration, doing ssh-add outside the ssh-agent shell may actually remember the pass-phrase permanently. But if you're like me, running a current Ubuntu release, that won't cut it anymore.


Appendix B: SSH config (i.e. the permanent solution)

If it doesn’t already exist, create a file ~/.ssh/config

Add these lines to the file.

Host github.com
    HostName github.com
    User git
    IdentityFile ~/.ssh/id_rsa

Host bitbucket.org
    HostName bitbucket.org
    User git
    IdentityFile ~/.ssh/id_rsa


The first time you attempt to SSH to either website (i.e. any time you “git push” or “git pull“), you’ll be prompted for your pass-phrase. After entering it once, you shouldn’t have to enter it again until you reboot.


Appendix C: SSH config explained

The Host line in an SSH config is actually a unique name given to an SSH host entry. SSH does a pattern match against the Hosts you have listed in your config. The Host is not necessarily the actual host name, which we can override using the HostName directive (in fact, we're also overriding the User name here).

If you were to add a section like this:

Host github-custom
    HostName github.com
    User git
    IdentityFile ~/.ssh/id_rsa_customkey

then you can specify a different SSH key to be used. In this case, I’m assuming I have an additional key pair named “id_rsa_customkey” and ““. I would have to add the Public Key to my GitHub account to use it.

To use the custom host, I would have to modify my URI.

git clone git@github-custom:povrazor/dairybox.git

Notice that the host in my URI is github-custom and not github.com.

The original SSH URIs work correctly because we specifically gave them the same Host as the HostName. Trickery. :)

Appendix D: Permissions

In case your permissions get messed up, sensible settings for Ubuntu 14.04 are:

sudo chmod 600 ~/.ssh/id_rsa
sudo chmod 644 ~/.ssh/
sudo chmod 644 ~/.ssh/known_hosts
sudo chmod 600 ~/.ssh/config   # ssh refuses a group-writable config file
sudo chmod 700 ~/.ssh


Gamasutra Feature Articles

Segmentation for mobile game developers

July 03, 2015 05:59 PM

"Push Notifications are a great way to (re-)engage with users and reward those who have been very loyal or entice those who haven't played in a while to come back and rejoin the gameplay." ...

Examining organic tutorials

July 03, 2015 05:39 PM

"What makes organic tutorial design work so well is that the player is learning how these mechanics work via situations and challenges that the game is built around." ...


Git Notes on Combining Repositories

by Mike K at July 03, 2015 05:09 PM

Just some notes. Recently I had to merge and organize three repositories into one, so here are the things I ended up doing.

Merging two repositories into one with full history

cd path/to/project-b
git remote add project-a path/to/project-a
git fetch project-a
git merge project-a/master # or whichever branch you want to merge
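One caveat for readers on newer Git (2.9 and later, so after this post was written): merging two repositories that share no common ancestor is refused by default, and the merge step needs an extra flag:

```shell
# Required on Git >= 2.9 when the two histories are unrelated:
git merge project-a/master --allow-unrelated-histories
```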

In my case, the local repository (source) was project-a, and the public repository (destination) was project-b.

When I was finished merging, instead of git push -u I had to push it like so:

git push --set-upstream origin master

(where master was the branch I was targeting)


Accepting all merged changes

After doing the above, my repository was filled with conflicts. I didn't care about the remote changes, so I was able to just blanket-accept my local changes.

git checkout --ours -- <paths>
# or
git checkout --theirs -- <paths>

In the above context, project-b (the destination) is --ours, and project-a is --theirs. I used --theirs, as I wanted my local repository merged into the public one.


Adding an empty branch

git checkout --orphan NEW_BRANCH_NAME
git rm -rf .

Now add/commit any change to keep it.
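Concretely, after the two commands above, any commit makes the branch real; the file name here is just an example:

```shell
# An orphan branch only exists once it has a commit:
echo "placeholder" > README
git add README
git commit -m "Initial commit on empty branch"
```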


Adding Multiple Origins

I haven't done this yet, but eventually I'll need to push code into two separate repositories on demand.
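For future reference, I believe the usual approach is to give origin multiple push URLs, so a single git push sends to both; the URLs below are hypothetical:

```shell
# The first --add should repeat the existing push URL, since adding any
# push URL stops Git from using the fetch URL for pushes:
git remote set-url --add --push origin git@github.com:someuser/somerepo.git
git remote set-url --add --push origin git@bitbucket.org:someuser/somerepo.git

# Verify: 'git remote -v' should now list two (push) entries.
git remote -v
```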


Gamasutra Feature Articles

Making design tools in Unity: Creating WorldShape

July 03, 2015 05:03 PM

"How does our WorldShape tool actually work? How might you create your own similar tools? I won't be able to explain every aspect of Unity, obviously, but I'll try to point you in the right direction." ...

Geeks3D Forums

Best E3 2015 AAA Titles Will Use Havok Tech

July 03, 2015 04:29 PM

Havok®, a leading provider of AAA game development technology, extends congratulations to all of the nominees of Game Critics Awards for “Best of E3 2015,” which includes several of Havok’s developer partners.  The nominees pushed the limits of...

Gamasutra Feature Articles

Epic Games leads GDC Europe's lineup of great sponsored talks

July 03, 2015 04:03 PM

Passes for GDC Europe 2015 are still available at a discounted Early Bird rate, and as the show draws nigh we're debuting a pair of sponsored sessions on UE4 and PowerVR that you'll want to check out. ...


OpenGL GPU-accelerates Adobe Illustrator CC 2015

July 03, 2015 01:38 PM

GPU-acceleration is the major performance enhancement to Adobe Illustrator CC 2015.  This SIGGRAPH paper Accelerating Vector Graphics Rendering using the Graphics Hardware Pipeline explains how OpenGL GPU-accelerates the PDF-based vector graphics rendering model used by Illustrator.  Attend the paper presentation at SIGGRAPH 2015 in Los Angeles.

Gamasutra Feature Articles

Blog: Letting our differences bring us together

July 03, 2015 11:06 AM

A rumination on games that satisfy different people: "I thought about other games I wished I could enjoy with my wife, who prefers a more methodical and thoughtful approach than I usually take." ...

Blog: Mobile free-to-play games are not evil

July 03, 2015 08:01 AM

"It's very common to hear from developers how evil, hurtful or abusive mobile games are. I don't think most of the comments reflect the reality." ...

Mod Mentality: How Tabletop Simulator was made to be broken

July 03, 2015 08:00 AM

Indie devs (and former modders) Jason Henry and Kimiko chat with Gamasutra about why they developed an "online tabletop sandbox" game aimed explicitly at modders, and what they learned along the way. ...

Gamasutra Feature Articles

'Indiependence Day' promotion shines light on dev struggles

July 02, 2015 09:54 PM

Indie dev consultant Dan Adelman has launched a campaign to help indies sell more games over the U.S. upcoming holiday weekend -- at full price. ...

The best in Japan: CEDEC shares 2015 award nominees

July 02, 2015 09:22 PM

CEDEC has shared the nominees for its annual awards across five categories, including popular games like Final Fantasy XIV. It's a peek into how Japan's industry works. ...

From cave painting to CryEngine: How game art developed

July 02, 2015 08:38 PM

Charting the evolution of in-game art and comparing it to the evolution of fine art: Going from primitive to polished, with deliberate discursions into the abstract. ...

Geeks3D Forums

(WebGL) Goo Technologies Terminator T-800

July 02, 2015 07:58 PM

Terminator T-800



Gamasutra Feature Articles

Get a job: Retro Studios is hiring a Character Artist

July 02, 2015 07:52 PM

Nintendo-owned game maker Retro Studios (Donkey Kong Country: Tropical Freeze) seeks to hire a character artist to do modeling, texturing and art design in Retro's Austin, TX office. ...

How the Xbox 360 'Red Ring of Death' cost Microsoft $1.15 billion

July 02, 2015 07:40 PM

In a podcast with IGN, former Xbox exec Peter Moore describes sitting across the table from Steve Ballmer explaining that the Xbox 360 was going to cost Microsoft at least an extra $1.15 billion. ...

Turning Snake into something more: How Snakebird evolved

July 02, 2015 07:07 PM

A walk through the art-style evolution and production of Snakebird, a charming game that sprung from the simple classic Snake to become something more. ...

Video: How Other Ocean made #IDARB (and you can too!)

July 02, 2015 07:00 PM

As part of the GDC 2015 Game Career Seminar's 'Micromortems' session, Other Ocean's Mike Mika and Frank Cifaldi playfully break down the design process of their hit multiplayer game #IDARB. ...

Don't Miss: 20 fun facts about hex grids

July 02, 2015 06:45 PM

In this light-hearted, timeless post, a studio shares twenty "of the interesting (and obscure) facts we uncover in our quest to know everything we can about grids and their use in games." ...

Game From Scratch

Wireframe Rendering in Blender

by at July 02, 2015 06:12 PM


Sometimes when you are working in Blender you want to render a beauty shot, but also show the wireframe of the model you are working on.  This video shows that process.  (As an added bonus, it also shows how to enable wireframe display while in Object mode, in case you were wondering how.)



The Process


Select the Object you wish to render with a wireframe overlay:



Hit Shift + D to duplicate it.



Select the material on the wireframe and make it unique.



Select a diffuse color for your wireframe.  Optionally enable emit if you want the wireframe to glow slightly.



Add a Wireframe modifier to the copy



Now render:


Geeks3D Forums

squarefeet ShaderParticleEngine

July 02, 2015 06:10 PM


A GLSL-focused particle engine for THREE.js. Moves a lot of the heavy lifting off the CPU and onto the GPU to save precious rendering time in busy scenes...

Timothy Lottes

Sugar Free Peppermint Chocolate Chip Custard Ice Cream [SFPCCCIC]

by Timothy Lottes ( at July 02, 2015 02:49 PM

EDIT: Photo and updated ingredients...

Been experimenting with ketogenic-friendly ultra-low-carb ice cream. This, along with bacon-wrapped sour cream, is one of the perks of the extremely high-fat, low-carb lifestyle. The first experiment didn't go as planned: placing the result in the freezer was a mistake. Apparently the sugar in normal ice cream is the key component that lets it maintain a great texture when frozen. For the second pass, I'm keeping the result in the fridge. Ingredients for part one,

1 pint - whole cream
3 - egg yolks
1/2 tsp - peppermint extract
6 drops - pure liquid stevia

Blend everything together in a blender on high. Pour into a pan on the stove and, stirring, slowly bring up to 160 deg F. Pour into a chilled container, store in fridge until chilled. Extra ingredients for part two,

1 - Lindt 90% chocolate square

Pour the chilled mixture into the ice cream maker, along with the chopped chocolate square. Lindt 90% works better than the lower-cocoa bars thanks to its higher fat content. Churn until the mixture looks like ice cream. Add to a chilled container, store in fridge until chilled. Eat. Turned out awesome. Going to try more peppermint next time and maybe another drop of stevia...

Game From Scratch

Torque 3D version 3.7 released

by at July 02, 2015 11:02 AM

Torque 3D is a now open source engine with a long and storied history, and today they released version 3.7. Hands down the biggest new feature is the release of an OpenGL renderer, making a Linux beta client available, with other platforms down the road.


Other highlights of this release include:

  • We finally updated to PhysX 3. And Bullet 2.8!
  • We've integrated two former commercial addons, both of which were open-sourced: Sahara and Walkabout. The former allows you to easily add snow, dust, moss and similar effects to your environments using a technique the author calls 'material accumulation', and the latter provides an editor/tweaker on top of recast/detour autogenerated navmeshes, as well as integration with the existing AIPlayer class.
  • We've had community contributions that add ribbon trails you can attach to almost anything, and a nice vignette PostFX shader.
  • Performance improvements to the scripting engine, as well as anonymous functions.
  • So many bugfixes.


Full release notes are available here.


Game From Scratch

Blender 2.75 released

by at July 01, 2015 10:10 PM

Today the Blender Foundation announced the release of Blender 2.75. Not really a ton there for game developers specifically, but a solid release all the same.


Image by Cosmos Laundromat -

Blender Foundation and the developer community are proud to present the official Blender 2.75 release. The main highlights of this release are:

  • Blender now supports a fully integrated Multi-View and Stereo 3D pipeline
  • Cycles has much awaited initial support for AMD GPUs, and a new Light Portals feature.
  • UI now allows font previews in the file browser.
  • High quality options for viewport depth of field were added
  • Modeling has a new Corrective Smooth modifier.
  • The Decimate modifier was improved significantly.
  • 3D viewport painting now supports symmetry and the distribution of Dynamic Topology was improved
  • Video Sequence Editor: Placeholders can now replace missing frames of image sequences
  • Game Engine now allows smoother LOD transitions, and supports mist attributes animation
  • And: 100s of bug fixes and smaller feature improvements.
Released: July 1, 2015.

Complete release notes are available here.


Game Design Aspect of the Month

Upcoming Workshop: Writing for Sci-Fi, Fantasy & Horror Game Worlds

by (Sande Chen) at July 01, 2015 03:54 PM

Hi! I'm pleased to announce a writing workshop I'm leading in conjunction with Playcrafting NYC. If you're interested in science fiction, fantasy, and/or horror and want to populate your game world with monsters, creatures, aliens, fantastical beasts, and otherworldly cultures, you can benefit from this participatory workshop.  It's next week, July 6th, details here.

I've written about the workshops I've attended to learn more about my own writing. I want this workshop to be about improving your work. I'll provide the framework but you will be the ones writing or developing your game world during the class.  Above all, let's have fun!

My background is a mixture of theatre, film, journalism, economics, and writing.  I received a S.B. in Writing and Humanistic Studies (now the major of Comparative Media Studies) at MIT, where I took classes on everything to do with science and writing, including science fiction.  My first game design doc was set in science fiction; my first game writing gig, the space combat RPG Terminus, was science fiction.  Afterwards, I worked on the episodic fantasy series Siege of Avalon, the MMO Wizard101, and the dark fantasy RPG The Witcher.  As you might surmise, if you love genre fiction, then there may be opportunities waiting for you in the video game industry.

Sande Chen is a writer and game designer whose work has spanned 10 years in the industry. Her credits include 1999 IGF winner Terminus, 2007 PC RPG of the Year The Witcher, and Wizard 101. She is one of the founding members of the IGDA Game Design SIG.