A quick video that shows some gameplay and explains the game's main mode.
This post follows the process I take to model and texture objects within Velicous Racing.
The process begins in Maya, where I first block out the basic shape using the existing track block-out I have saved.
Once the basic block-out is complete I move the checkpoint into its own file so I can finalize the shape and add fine details. At this point I also UV map the block-out, so that if I decide to bevel an edge or chamfer a corner the UV map is updated automatically. I also UV map the sign portion of the checkpoint separately so a tileable texture can be used.
Now that the UV mapping is done, I begin to add basic low-poly details, such as edge bevels.
With the basic low-poly model finished, I move on to creating the high-poly model so it can be baked down to a normal map, ensuring the edges of the low-poly model display correctly. More detail will be added to the normal map later using the Quixel Suite.
With both the low- and high-poly models created, I go ahead and create a cage for both parts of the model. Because only one part of the model (not the flat screen) needs a normal map, I will be baking that part of the model only.
With the basic normal and ambient occlusion maps baked, I can bring the model into the Quixel Suite, specifically so that I can use NDo to create a more detailed normal map.
Above: the model loaded into NDo with no adjustments made to the normal map. The video below shows me creating a more detailed map for the prop using NDo.
Once the more detailed normal map is created, it is time to move the prop into DDo within the suite so that the other texture maps can be created. Displayed below are all the maps that DDo requires in order to operate properly.
The video below shows my process of creating the other texture maps within Quixel.
With the prop textured I can import the asset into Unity and assign the various materials. Because I keep the emissive map completely black and white, I can adjust its value in the editor, which is how I am able to change the color of the checkpoint to suit its surroundings.
The final step in creating the checkpoint is to animate the lettering. Because I have created the object in two parts with two UV maps, I can simply offset the texture through a script, which gives it the appearance of moving. The video below illustrates this.
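The offset logic itself is simple. The sketch below is plain Python rather than the actual Unity script (which would update the material's texture offset each frame), and the speed value is illustrative; it shows the wrap-around math a texture-scroll script performs.

```python
def scroll_offset(elapsed_time, speed=0.5):
    """Return a horizontal UV offset in [0, 1) that wraps after each full tile."""
    return (elapsed_time * speed) % 1.0

# Because the sign texture tiles horizontally, shifting it by a whole number
# of tiles looks identical, so only the fractional part of the offset matters.
print(scroll_offset(0.0))  # 0.0
print(scroll_offset(1.0))  # 0.5
print(scroll_offset(5.0))  # 2.5 wraps to 0.5
```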
This post covers the production of the first environment.
The first step in producing the environment was to create a basic block-out of the track within Maya according to the concept.
The first step of this process is to create a simple plane with the desired width of the road and to UV map it. This plane will be duplicated many times to create the final road. It is best practice to leave some divisions on the plane, as it will need to be deformed and the divisions will give a cleaner deformation.
The next step I take is to create a CV curve that follows the track concept design as closely as possible; this curve will be used to deform the road planes. Although this curve does not represent the final shape of the track, it provides a basis to build from, which is why we must follow the concept closely. If the track is made up of multiple individual sections, then a separate curve can be used to create each section.
Once the curves are complete, I begin to measure the length of each curve and create the corresponding road sections. The length of each curve can easily be measured in Maya using the ‘arclen’ command.
Once the length of each curve is known, the number of planes needed to create each section can be calculated using:
‘curve length / piece length = number of pieces needed to create the section’
I then use Maya's 'Duplicate Special' on the single road section created in step one; the number of duplicates is calculated using the formula above.
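As a quick sketch of the calculation (the lengths here are illustrative, and I round up so the planes cover the whole curve rather than leaving a gap):

```python
import math

def pieces_needed(curve_length, piece_length):
    """Curve length divided by piece length, rounded up so the duplicated
    road planes cover the entire curve."""
    return math.ceil(curve_length / piece_length)

# e.g. a curve that Maya's 'arclen' command reports as 1037.4 units long,
# covered by road planes 25 units long:
print(pieces_needed(1037.4, 25))  # 42
```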
Next I connect the created road section to its corresponding curve. This is done by first combining all of the road pieces into one single mesh and then attaching that mesh to the curve using ‘Attach to Motion Path’. Finally, the combined mesh can be deformed to the shape of the curve using the ‘Flow Path Object’ option within Maya's constraint menus. The resolution of the flow path can be controlled using its divisions attribute.
Further adjustments can be made by moving the lattice created by the flow path object. Once I am happy with the road section I delete the construction history of the mesh, which disconnects it from the curve and flow path, leaving just the static mesh behind. Finally, the mesh can be ‘separated’; this step undoes the ‘combine’ step taken earlier and breaks the mesh back into the individual planes it is constructed from. The final result is multiple individual planes, each with its own identical UV map, deformed to create the basic shape of the track.
The same process is repeated for each curve and section of the road until the full track block out is complete.
I import the blocked-out track into Unity and also create a terrain asset, so that I can begin to block out the actual environment within Unity.
The next step was to block out the basic areas of interest according to the concept. For this I use Unity’s prototype assets, as they are perfectly scaled to various sizes and can therefore be used as a size reference without my needing to import them into Maya.
Once the basic block-out is created within Unity, I begin to create assets in Maya. The block-out objects are used as a size reference so that the props fit in their intended places.
Some shots of props created in Maya:
Once props are created in Maya they can be brought into Unity to be placed in their final locations.
Some props are duplicated many times within Maya before being sent to Unity, as Maya's snapping and organizing tools are more powerful than Unity's.
For instance, props can be parented to another prop before being brought into Unity; within Unity the child prop can simply be zeroed out and it will sit in its correct location.
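A simplified sketch of why zeroing out works (rotation and scale are ignored for simplicity, and the numbers are illustrative):

```python
def world_position(parent_world, child_local):
    """World position of a child transform: the parent's world position
    plus the child's local offset (rotation and scale ignored)."""
    return tuple(p + c for p, c in zip(parent_world, child_local))

# A prop parented in Maya with a local offset of (2, 0, -3):
parent = (10.0, 0.0, 5.0)
print(world_position(parent, (2.0, 0.0, -3.0)))  # (12.0, 0.0, 2.0)

# 'Zeroing out' the child in Unity sets its local offset to (0, 0, 0),
# snapping it exactly onto its parent's position:
print(world_position(parent, (0.0, 0.0, 0.0)))  # (10.0, 0.0, 5.0)
```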
Once 95 percent of the props are placed in their correct positions, I begin to light the level. The first step I take in this process is to adjust the ambient light in the scene to replicate sunlight or moonlight. Because this will be a night scene, I am going for a moonlit feel.
I first change the skybox to suit and then adjust the ambient light until I feel I have achieved a moonlit look. From studying moonlit pictures I have decided moonlight is generally pure white or a weak blue. In my scene I tried both, with blue giving the best results.
Once the ambient light is complete, I move on to creating fog. Fog within Unity can be used to hide areas you do not want the player to see, such as the horizon line. It also gives distant objects a more realistic feel, as in real life the further an object is from the viewpoint, the more blurred it appears.
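Unity's exponential fog mode follows a falloff of roughly this shape; the density value below is illustrative, not the one used in the scene.

```python
import math

def exp_fog_factor(distance, density=0.02):
    """Fraction of the fog colour blended in at a given view distance,
    following an exponential falloff: 1 - e^(-density * distance)."""
    return 1.0 - math.exp(-density * distance)

# Nearby props stay mostly clear, while the horizon is almost fully fogged:
print(round(exp_fog_factor(10), 3))   # 0.181
print(round(exp_fog_factor(500), 3))  # 1.0
```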
The next part of lighting the scene involves placing emissive materials around the scene along with light objects. I manually place lights where they are needed; there should be no black areas in the scene that the player can access. If the player can reach a part of the environment, it should be lit.
With the lights placed in the scene, I can finally bake the precomputed GI information. This information allows Unity to compute light bounce at runtime, so that objects are indirectly lit by light bouncing off other objects, which gives the scene a much more realistic feel and improves the lighting overall.
With the lighting finished, I call the environment complete, although I will continue to make small changes.
This post covers how I went about modelling the 3D vehicles for my FMP. The vehicles will be modeled using Autodesk Maya 2016 and textured using the Quixel 3DO Suite.
The vehicles are modeled according to the concepts created in an earlier post.
Austin Mini Cooper:
Using a blueprint reference to get the correct dimensions, I first created a ‘cage’ using the curve system within Maya. The cage follows the outline of the vehicle precisely and will be used to create the first geometry.
Once the cage is complete, Maya’s ‘surface’ tools can be used to create NURBS surfaces that follow the shape of the curves created in the previous step. Although NURBS surfaces cannot be used within a game engine directly, they can be converted to polygon geometry within Maya, which can.
Because the curves created in the first step did not all have an equal number of points, the surfaces they create, and therefore the final polygon topology, are undesirable. The poly count is far too high, and the impractical shapes in the topology will produce bad reflections.
I therefore set about retopologizing the model using Maya’s Quad Draw tool. This tool allows me to preserve the original surface while tracing a new surface over it.
Next I add details to the model so that it looks like the original car. That way, when I come to modify the model to fit the concept, it will appear to be a modified Mini and not some other vehicle.
Once the ‘original’ car was complete, I modified it following the concept image I had created earlier. The renders below show the modified model, plus a render with some basic materials applied within Maya.
Now that the basic model was complete, the UV mapping process could begin. As much of the car is mirrored, the UVs can be mirrored too: by UV mapping half of the car and then mirroring those mapped parts, the UVs come across with the geometry. This not only saves me time, but also saves space on the UV map, ultimately allowing me to add more fine detail to the texture maps.
Now that the model is UV mapped the texturing process can begin. Because Quixel’s 3DO suite requires a ‘base’ normal map, the texturing process actually begins in Maya.
Using the low poly model already created, I go ahead and create a high poly version.
I can then take these two models into a baking program such as XNormal, so that a ‘blank’ normal map can be baked from high to low. This normal map contains only the UV boundary information and smoothing information that Quixel utilizes to generate effects.
Because I do not want to bake each piece of the model individually within XNormal, I create an ‘exploded’ version of the model. This exploded version has the same UV information as the un-exploded version, so any data baked from it will also work on the un-exploded version. Exploding the model reduces the chance of any baking rays being blocked by other geometry.
Now that the ‘base’ normal map is created, the model can be imported to Quixel. Within Quixel NDo is used to create a normal map. Small details such as dirt or rust are added into the normal map using NDo’s various features.
Once the normal map is finished, Quixel’s DDo tool can be used to create the other PBR maps; specifically, for Unity, diffuse, specular, ambient occlusion and emission maps are needed.
I created two versions of the car, a clean version and a dirty rusted version.