This page contains technical details of the development of LOVE, with edited excerpts from the news page.
Engine
This is a very technical post, but for people who are interested, I will now give you an overview of the rendering engine and technology powering LOVE.
The rendering engine used for LOVE is a custom-built, highly optimized OpenGL engine that tops out at around 200 FPS running at 1920 by 1080 on a 100 USD graphics card. While performance has always been important, so have visual quality, compatibility, and most of all dynamic content. I try to use as few GL extensions as possible; beyond GL 1.2 the only ones I use are cube maps, FBOs, GLSL_100, and VBOs (optional), giving me a very high degree of compatibility and portability.
Writing a good graphics engine is as much an art as it is a science; there are often many conflicting goals and pieces of optimization advice to follow. Sort front to back or state sort? CPU PVS or brute-force GPU? Few state changes while avoiding uber shaders? Geometry or texturing? Looks vs performance? Here are some of the choices I have made.
Terrain engine
The engine is geared towards productivity and the ability to produce procedural content. To do this the engine uses a custom terrain primitive that makes the entire world dynamic. Imagine a normal grid height-field engine, but let each block store all 4 corner values independently, so that you can make cliffs. Then, with a Boolean, add the ability to make a cave, by adding 4 floor values and 4 ceiling values. Finally you add an integer value to the top and floor surfaces that indicates the type of surface; depending on the type of surface you can replace the quad with a hand-modeled object that has a square bottom, like a tree. This is the basic geometry in LOVE. It is so simple that I can generate landscapes, buildings, even cities; I can let players modify the environment and I can even make it destructible. Yes, it is not as flexible as a polygon soup, but I can do so much more with it, and with polygon replacement I can dress it up with hand-modeled geometry.
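To make the idea concrete, here is a minimal sketch of what such a block could look like; the field names and layout are my illustration of the description above, not the actual LOVE source.

/* Hypothetical terrain block: one cell of the grid described above. */
typedef struct {
    float top[4];      /* four independent corner heights of the top surface   */
    int   top_type;    /* surface type; may swap the quad for a modeled object */
    int   has_cave;    /* Boolean: does this block also contain a cave volume? */
    float floor[4];    /* cave floor corner heights (only used if has_cave)    */
    float ceiling[4];  /* cave ceiling corner heights (only used if has_cave)  */
    int   floor_type;  /* surface type of the cave floor                       */
} TerrainBlock;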
The terrain engine is by far the biggest part of the engine, and it also consumes almost all graphics cycles. The world is built up out of grids of blocks, and 8 by 8 blocks form a group. Each group's polygon mesh is generated by the engine and put into a very large Vertex Buffer Object (VBO). By keeping all groups in a single large VBO I can bind the buffer once, set up my vertex pointers (I use an interleaved vertex format for size and caching reasons) at the beginning of the array once, and then make a number of ranged draw calls for the groups that are inside the view frustum. Reducing state changes like this gives a massive boost to performance (especially not having to reset the vertex pointers for each draw call), and I try to do it across the engine. The polygon array for each group starts with all walls, then objects, then floors, and finally edges and grass. This way I can draw different ranges depending on how far away the group is.
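As a rough sketch of how that can look in code (assuming GL 1.5 style buffer binding and classic vertex arrays; on some platforms the buffer entry points have to be loaded as extensions, and the Vertex and Group types and the visibility test are stand-ins, not the real implementation):

#include <GL/gl.h>
#include <stddef.h>

typedef struct { float pos[3], normal[3], uv[2]; } Vertex;            /* interleaved format */
typedef struct { GLint first_vertex; GLsizei vertex_count; } Group;

extern int group_visible(const Group *group);    /* frustum test, not shown here */

void draw_terrain(GLuint terrain_vbo, const Group *groups, int group_count)
{
    int i;
    glBindBuffer(GL_ARRAY_BUFFER, terrain_vbo);  /* bind the one big VBO once per frame */
    glVertexPointer(3, GL_FLOAT, sizeof(Vertex), (void *)offsetof(Vertex, pos));
    glNormalPointer(GL_FLOAT, sizeof(Vertex), (void *)offsetof(Vertex, normal));
    glTexCoordPointer(2, GL_FLOAT, sizeof(Vertex), (void *)offsetof(Vertex, uv));
    for (i = 0; i < group_count; i++)
    {
        if (!group_visible(&groups[i]))          /* skip groups outside the view frustum */
            continue;
        /* one ranged draw call per visible group, no pointer setup inside the loop */
        glDrawArrays(GL_TRIANGLES, groups[i].first_vertex, groups[i].vertex_count);
    }
}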
If the group is far enough away I only draw walls (the ground won't be visible due to the curvature of the planet); as it gets closer I draw more and more of the array until I draw the smallest details like the grass. Keeping everything in a single VBO does force me to do my own memory management to defragment the VBO. Given the size of the world I only store groups that are close to the camera in the VBO, so when the player moves, new groups have to be added and old ones removed. All this, however, forces me to draw all terrain with a single shader, and the little texture mapping I do (I mainly flat-map polygons with single colors, for artistic and art-budget reasons) is done by UV mapping around one very large texture.
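A sketch of the distance-based range selection (the counts and thresholds are illustrative; the real group layout likely differs):

/* Each group's vertices are ordered walls, objects, floors, then edges/grass,
 * so picking a single count by distance decides how much detail is drawn. */
#define NEAR_DISTANCE   64.0f   /* made-up thresholds */
#define MID_DISTANCE   256.0f
#define FAR_DISTANCE  1024.0f

typedef struct {
    int first_vertex;
    int wall_count, object_count, floor_count, detail_count;
} LodGroup;

int group_draw_count(const LodGroup *group, float distance)
{
    if (distance > FAR_DISTANCE)   /* far away: walls only, curvature hides the ground */
        return group->wall_count;
    if (distance > MID_DISTANCE)
        return group->wall_count + group->object_count;
    if (distance > NEAR_DISTANCE)
        return group->wall_count + group->object_count + group->floor_count;
    return group->wall_count + group->object_count +
           group->floor_count + group->detail_count;    /* close up: everything, grass included */
}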
To make my geometry more organic I add a quad that protrudes out from each polygon edge. This quad is then given a rounded profile using an alpha-test texture. By lighting and shading it the same way as the polygon, it nicely blends with and extends the surface. Normal engines have a lot of detail on the surface with little edge detail, whereas I go in the opposite direction. The majority of the "look" of LOVE comes from these very simple tricks rather than any complicated screen-space computations.
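Roughly, such an edge quad could be built like this (a sketch only; it reuses the hypothetical Vertex type from the terrain sketch above, and the real code surely handles more cases):

/* Emit a quad protruding from the polygon edge a->b along an outward direction
 * in the polygon's plane. The rounded profile comes from an alpha-tested
 * texture mapped across the quad (v = 0 at the edge, v = 1 at the outer rim),
 * and reusing the polygon's normal makes the lighting match the surface. */
void emit_edge_quad(const float a[3], const float b[3], const float out[3],
                    const float normal[3], float width, Vertex quad[4])
{
    int i;
    for (i = 0; i < 3; i++)
    {
        quad[0].pos[i] = a[i];
        quad[1].pos[i] = b[i];
        quad[2].pos[i] = b[i] + out[i] * width;
        quad[3].pos[i] = a[i] + out[i] * width;
        quad[0].normal[i] = quad[1].normal[i] = normal[i];
        quad[2].normal[i] = quad[3].normal[i] = normal[i];
    }
    quad[0].uv[0] = 0.0f; quad[0].uv[1] = 0.0f;   /* at the edge      */
    quad[1].uv[0] = 1.0f; quad[1].uv[1] = 0.0f;
    quad[2].uv[0] = 1.0f; quad[2].uv[1] = 1.0f;   /* at the outer rim */
    quad[3].uv[0] = 0.0f; quad[3].uv[1] = 1.0f;
}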
Shaders
The main surface shader is a fairly simple one that uses normal texturing and lighting for local light sources. The sun light and ambient light are looked up in two cube maps. These cube maps allow me to precisely control the color from each angle, and lately I have added a bit of colored noise to them to give some life to the environment. I refrain from filtering any of the cube maps, to create some extra banding that helps establish the painterly look. The shadows are done with a simple shadow map. The complexity of the terrain unfortunately requires me to draw it in its entirety during the shadow pass, making cascading shadow maps a little too expensive. Finally I add a four-color fog, with a near and a far color for the direction towards the sun and the direction away from it. This fog algorithm is then replicated in the sky box to match.
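The four-color fog can be sketched as follows (CPU-side C for clarity rather than the actual GLSL; the colors and names are placeholders):

/* Two colors are blended by how much the view direction faces the sun, and the
 * resulting near and far colors are then blended by distance. */
static const float near_sun[3]  = {1.0f, 0.9f, 0.7f};   /* placeholder colors */
static const float near_away[3] = {0.7f, 0.8f, 0.9f};
static const float far_sun[3]   = {1.0f, 0.8f, 0.6f};
static const float far_away[3]  = {0.6f, 0.7f, 0.9f};

void fog_color(const float view_dir[3], const float sun_dir[3],
               float distance, float fog_end, float out_rgb[3])
{
    /* map dot(view, sun) from [-1, 1] to [0, 1]; 1 means looking towards the sun */
    float toward = (view_dir[0] * sun_dir[0] + view_dir[1] * sun_dir[1] +
                    view_dir[2] * sun_dir[2]) * 0.5f + 0.5f;
    float depth = distance / fog_end;
    int i;
    if (depth > 1.0f)
        depth = 1.0f;
    for (i = 0; i < 3; i++)
    {
        float near_c = near_away[i] + (near_sun[i] - near_away[i]) * toward;
        float far_c  = far_away[i]  + (far_sun[i]  - far_away[i])  * toward;
        out_rgb[i]   = near_c + (far_c - near_c) * depth;
    }
}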
FX
The game employs a lot of FX and they are all based on two primitives: sprites, and quads expanded along an axis (lines). Both of these are expanded and aligned towards the camera inside a vertex shader. A major problem with FX work is that you end up with very many draw calls, sometimes containing only a single polygon. Therefore I consolidate all sprites and lines into two draw calls by accumulating them into large arrays. Again, this means they need to map around large textures rather than giving each sprite and line type its own texture. Objects are collected in a similar way so that they can be drawn all at once. The only FX not done using these arrays is the mist. The mist is made up of large sprites that travel over the surface of the terrain. To get thick enough mist I end up with a lot of overdraw, especially if many mist sprites get so close to the camera that they cover the entire screen. To combat this I have a special shader that fades out the sprites as they get closer to the camera, and if they are close enough it simply doesn't expand the sprite, so the rasterizer ends up rasterizing an infinitely small polygon.
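A sketch of the sprite batching idea (the layout and names are mine, not the actual FX code): every sprite is appended to one big array during the frame and drawn with a single call, and the camera-facing expansion is left to the vertex shader.

#define MAX_SPRITES 4096

typedef struct {
    float pos[3];     /* world-space center, identical for all four corners         */
    float corner[2];  /* -1/+1 pair telling the vertex shader which way to expand   */
    float size;       /* sprite radius; a mist-style shader can fade/shrink it to 0 */
    float uv[2];      /* base UV into the one large shared FX texture               */
} SpriteVertex;

static SpriteVertex sprite_array[MAX_SPRITES * 4];
static int sprite_count = 0;

void sprite_add(const float pos[3], float size, float u, float v)
{
    static const float corners[4][2] = {{-1, -1}, {1, -1}, {1, 1}, {-1, 1}};
    int i, j;
    if (sprite_count >= MAX_SPRITES)
        return;                                    /* array full: drop the sprite */
    for (i = 0; i < 4; i++)
    {
        SpriteVertex *v_out = &sprite_array[sprite_count * 4 + i];
        for (j = 0; j < 3; j++)
            v_out->pos[j] = pos[j];
        v_out->corner[0] = corners[i][0];
        v_out->corner[1] = corners[i][1];
        v_out->size  = size;
        v_out->uv[0] = u;
        v_out->uv[1] = v;
    }
    sprite_count++;
}
/* At the end of the frame the whole array is uploaded and drawn in one call;
 * the vertex shader moves each corner out along the camera's right/up axes by
 * corner * size, and the mist version scales size down as it nears the camera. */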
Post Effects
The entire frame is drawn into an off-screen buffer (using FBOs) and then drawn to the screen with a color-correcting and filtering shader. To improve performance and soften the image I draw the off-screen buffer at a slightly lower resolution than the screen resolution; in practice this has very little impact on performance, since I'm almost completely vertex bandwidth bound.
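A sketch of the off-screen setup, assuming the EXT_framebuffer_object entry points (the resolution scale and names are illustrative):

void create_offscreen_target(int screen_w, int screen_h,
                             GLuint *fbo, GLuint *color_tex, GLuint *depth_rb)
{
    int w = (screen_w * 3) / 4, h = (screen_h * 3) / 4;    /* slightly below screen resolution */

    glGenTextures(1, color_tex);                           /* color target the post shader reads */
    glBindTexture(GL_TEXTURE_2D, *color_tex);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA8, w, h, 0, GL_RGBA, GL_UNSIGNED_BYTE, NULL);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);

    glGenRenderbuffersEXT(1, depth_rb);                    /* depth buffer for the scene pass */
    glBindRenderbufferEXT(GL_RENDERBUFFER_EXT, *depth_rb);
    glRenderbufferStorageEXT(GL_RENDERBUFFER_EXT, GL_DEPTH_COMPONENT24, w, h);

    glGenFramebuffersEXT(1, fbo);
    glBindFramebufferEXT(GL_FRAMEBUFFER_EXT, *fbo);
    glFramebufferTexture2DEXT(GL_FRAMEBUFFER_EXT, GL_COLOR_ATTACHMENT0_EXT,
                              GL_TEXTURE_2D, *color_tex, 0);
    glFramebufferRenderbufferEXT(GL_FRAMEBUFFER_EXT, GL_DEPTH_ATTACHMENT_EXT,
                                 GL_RENDERBUFFER_EXT, *depth_rb);
    glBindFramebufferEXT(GL_FRAMEBUFFER_EXT, 0);
}
/* Each frame: bind the FBO, render the scene at w by h, bind FBO 0 again, and
 * draw a full-screen quad textured with color_tex through the color-correcting
 * and filtering shader, which scales the image up to the screen resolution. */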
When writing a custom engine like this, and especially when doing non-photorealistic rendering, you need to spend a lot of time experimenting and trying things out. Some parts of the graphics pipeline have been totally rewritten from scratch four or five times. As for the future, one thing I am still considering adding is an image-space distortion map, but there are too few reasons to use it and the cost of switching FBO state is unfortunately a bit high. Another possible addition is to use a float buffer for the off-screen rendering, but again the bandwidth cost is far too high for the added quality. I would consider making these things optional, but it has been a long-standing goal of mine to present good visual quality even on low-end rigs, and to avoid giving anyone the feeling that they are not experiencing the game as it was meant to be.
Production
While the LOVE project is very much about making a great game, it is also about finding a way to become so much more effective that you can make a game like this with a very small team. If I can make a graphically impressive game alone, I will surely have proven that my ideas can work. In the past, hardware was so limited that we had to focus most of our energy on making it draw pixels and polygons; now, however, we can draw so much that the problem has shifted to being able to produce the polygons and pixels to draw. Moore's law states that we can draw twice as much every 1.5 years. So I'm suggesting an "Eskil's law of game development", if you will, stating: if you can't double the amount of content you produce per man-hour every 1.5 years, the way you are working is unsustainable. Today we have teams of up to 200 people; if we imagine that the next console generation will be 8 times as powerful, does that mean we need teams of 1600 people? Even a doubling of team size is quite unsustainable.
This is not just a question of money. This is about making good games. Good games come from being able to iterate, make changes, and having the time to do so.
In the past, tool development has focused on solving the hard problems, like cloth, hair, and fluids, but I think we need to focus on the simple things, like making polygons and pixels, because that's what we spend most of our time doing. It is no longer about what we can do; it's about what we can get done. When we design tools and our pipeline we need to be constantly conscious of the time it takes to complete each task. If you say that creating a character can take no longer than 2 days, an engine developer can't ask the artist to produce more detail or additional data like specular maps without first automating or simplifying another aspect of character creation, to give the artist time to perform the new tasks. The development of technology must keep step with the development of the techniques used to create the content that feeds it.
Internal development of tools should be a bigger effort than engine and game programming. Tools have a longer lifespan than engines, reduce art costs, and are therefore a far better investment than art assets or engines. My opinion is that you also need to release your tools outside your studio. It stops you from making quick and dirty hacks because "no one will see it anyway", it forces you to document, and it lets you hire people who can already use your tools. Most of all, if you make a tool that isn't specifically designed for your engine/team/problem, you will make it more flexible, a flexibility that will eventually pay you back by letting you discover new things you didn't know you wanted to do. Releasing the tools, especially if you do it commercially, gives weight to tool development as an important part of what you do. If the tool development brings in money by itself, it becomes harder to question its existence. Some people think that releasing your tools means releasing your competitive advantage, but you will always have the advantage of having access to the developers of the tools and to the latest version.
LOVE has been developed using the many tools you can find on this site, including Loq Airou and Co On.
The pipeline
I obviously use Verse as the backbone of my pipeline. Verse is a network protocol that gives all tools, and the game engine itself, real-time access to all assets. If a change is made in a tool by one user, all other users will see the change in real time in all tools, including the game itself. It completely removes the need for files, and everything is WYSIWYG. An awful lot of the problems associated with asset management simply never occur.
One very powerful feature of Verse is that it has a universal data format that is not directed towards any particular engine or usage. If the artist creates an object with a material, he/she controls its shape and its surface properties; the export code that loads the data into the game then converts the shapes to polygons and the properties to shader code. If you want more or fewer polygons you simply tweak the export code, you don't tell your artists to redo their work. If you have a new lighting model, you don't have to ask your artists to use it, you simply make the exporter include it when it generates the shaders.
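The idea, very roughly (all of these names are hypothetical; this is not the Verse API, just the shape of such an exporter as described above):

typedef struct Shape    Shape;     /* artist-authored geometry on the Verse server */
typedef struct Material Material;  /* abstract surface properties, no engine terms */
typedef struct { char *shader_source; /* plus vertex data, omitted */ } Mesh;

extern Mesh *tessellate_shape(const Shape *shape, int subdivisions);     /* hypothetical */
extern char *generate_shader(const Material *mat, int lighting_model);   /* hypothetical */

Mesh *export_object(const Shape *shape, const Material *material)
{
    /* Want more or fewer polygons? Change this constant, not the artwork. */
    Mesh *mesh = tessellate_shape(shape, 2);
    /* New lighting model? Regenerate the shaders instead of re-briefing the artists. */
    mesh->shader_source = generate_shader(material, 1);
    return mesh;
}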
In designing a workflow for game development, a major goal has been to remove the need for much of the in-studio communication, especially between programmers and artists. Let's say you are making a game with a tank. You need a model for the tank and one for the turret. Normally a programmer would have to request the graphics from the artist, agree on naming, make sure it has the right orientation and size, and then wait for the artist to make at least some mockup graphics before the programmer can start working. A better solution is to have an API that the programmer can use in the code to request an object called "tank" that is roughly a green 5 by 3 by 1.5 meter box. The API will then look for the geometry, and if it isn't found, it will create the stand-in box instead. This way the programmer can get to work immediately. The next time the artist looks in the Verse server he/she will find the geometry, with naming and rough proportions already there, ready to be refined into a proper tank model. The request in the API is turned into a request in the asset management system. If the tank, or any other asset, should ever be lost or deleted, the engine will always be provided with stand-in data so that it doesn't disrupt the work of others. This system is also used to store gameplay data. If an AI programmer needs a setting for a character, say how many health points they have, they simply request a tag value and set a default, and the tag will then appear in the interface of the tools.
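In code, the stand-in idea might look roughly like this (all names are hypothetical; this is a sketch of the behavior described above, not the actual API):

typedef struct Asset Asset;

extern Asset *asset_find(const char *name);                      /* look up on the Verse server */
extern Asset *asset_create_box(const char *name,                 /* create a named stand-in box */
                               float x, float y, float z,
                               float r, float g, float b);
extern float  tag_request(const char *object, const char *tag, float default_value);

Asset *asset_request(const char *name, float x, float y, float z,
                     float r, float g, float b)
{
    Asset *a = asset_find(name);
    if (a == NULL)                         /* not made yet: create the stand-in so work continues */
        a = asset_create_box(name, x, y, z, r, g, b);
    return a;                              /* the artist later refines the same named object */
}

/* Usage: the programmer asks for a rough green 5 x 3 x 1.5 meter "tank" box,
 * and the AI programmer asks for a "health" tag with a default value that then
 * shows up in the tools' interfaces.
 *
 *   Asset *tank  = asset_request("tank", 5.0f, 3.0f, 1.5f, 0.2f, 0.4f, 0.2f);
 *   float health = tag_request("soldier", "health", 100.0f);
 */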
Tags are also used to keep track of the progress of development. In my asset management tool Co On you can configure the 3D display to color-code objects, or draw text or icons on them, depending on their tags. You may make objects that are critical red, or objects assigned to you blue. This system can also be configured depending on your task: a modeler may see a completed model as green, while a texture artist sees it as red, indicating that it still needs texturing. There is no paperwork, no internal web forms; everything is right there in the 3D view of the tool. For the LOVE project this kind of functionality has clearly been shown to be overkill, since communication on a single-developer project is rarely a problem.
-To learn more about the programming and how to get started, read this News post.
-Want to mail Eskil? Go ahead.
-Want to know more and keep up with the development? Read the News.