• Loading No Place For Traitors

    Loading No Place For Traitors

    Although it might not seem so, we have been really busy working on our prototype, and I am quite happy to announce that we have now scripted all of the puzzles intended for the demo - up to now only in German, but along with the scripting we have also completed the animations, most of the gfx and of course all of the in-game realtime models. (The image above is the loading screen.)

    So we are now adding some effects, programming some necessary shaders and will be doing some first tests concerning the puzzles (which means having friends play the prototype and looking over their shoulders to see where they get stuck). Also, our sound designer, Daniel Migge, will soon start collecting and creating the needed sounds.

    So we are well on our way, and I hope that before long we will be able to send the prototype to some publishers. Stay tuned!


  • Quick Tip: Transparency (Alpha) in Unity

    It has been some time since I posted anything. This was due to work I had to do for clients, because - as some of you may have noticed - Serious Monk is not a full-time job... yet. It is rather the symbol of my wanting to do this full-time, and hopefully it will one day result in exactly that. Until then, I will have to work for clients every once in a while to earn the money to keep this thing up and running.

    Nevertheless we were quite productive over the last weeks, and I wanted to share the solution to a small but significant problem we encountered: transparency in Unity. As you already know (if you have read the other articles), we 'fake' our 3-dimensional room: the characters are real 3D models, but the rest is planes with high-res textures on them. For furniture like the table in the Refektorium or the oven in the kitchen, we use planes with a .tga file UV-mapped onto them. The .tga file has an alpha channel, which lets the background show through where the table or oven ends.

    We handle changes in the scenes the same way, for example when a character picks something up - we just fade in or fade out a plane with a corresponding texture. Of course, simple objects are not full resolution; the planes are modelled as small as possible. We had a situation where a big plane (green border), simulating the change of a door, was placed behind a smaller plane (red border), simulating the change of a detail in that door.


    Unity Transparency Tip

    In Unity, objects with a transparent shader are sorted by distance to the camera: objects further away from the camera get drawn first, then the objects closer to the camera. In Blender, the smaller plane is in front of the bigger plane, but in Unity it was quite the opposite. That has to do with the way Unity calculates the distance, which is maybe not too obvious: Unity takes the center of the object - the center of the bounding box, NOT the origin - and draws a line to the camera. The length of this line is the distance, and thus determines the drawing order.

    In our case, the center of the smaller plane was further away from the camera because it sits lower in the scene, which results in a greater distance along the top-bottom axis.
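    To see what is going on, a small debug script can log the distance described above - from each renderer's bounding-box center to the camera. This is just a sketch of ours for checking the sort order, not something from our actual project:

    using UnityEngine;

    // Debug sketch: logs the distance the transparent queue sorts by -
    // from the bounding-box center of each renderer to the camera.
    public class TransparencySortDebug : MonoBehaviour
    {
        void Start()
        {
            Vector3 cam = Camera.main.transform.position;
            foreach (Renderer r in FindObjectsOfType<Renderer>())
            {
                // bounds.center is the center of the bounding box in world
                // space, NOT the object's origin/pivot.
                float dist = Vector3.Distance(r.bounds.center, cam);
                Debug.Log(r.name + " sorts at distance " + dist);
            }
        }
    }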

    Unity Transparency Tip 2

    The solution is simple (if you know it): you just have to re-position the center of the bounding box (right arrow) by adding one extra vertex (left arrow) and placing it behind the plane. This way, you can influence the distance to the camera while the plane and its texture stay in exactly the same spot. Now Unity draws our transparent planes nicely in the order we want... YEEHAW!
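    For completeness: if editing the mesh in Blender is not an option, Unity can also be forced to draw a specific material later by raising its render queue. This is not the approach described above, just an alternative sketch:

    using UnityEngine;

    // Alternative (not the extra-vertex trick from the post): force the small
    // detail plane to be drawn after the big door plane via the render queue.
    public class DrawOnTop : MonoBehaviour
    {
        void Start()
        {
            // Transparent geometry normally lives in queue 3000;
            // a higher value is rendered later, i.e. on top.
            GetComponent<Renderer>().material.renderQueue = 3100;
        }
    }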


  • Antonius Moodstills re-work

    Antonius Moods 2

    So the last days I have been busy with other jobs, and it is only today that I have been able to do some work on 'No Place For Traitors' again. I was scripting some of the dialogues when I noticed that the 5 'Emoticons' of my main character don't seem to cover all the emotions I need. So I did what I should have done before starting the 'Emoticons': research! BUT no harm done, the frames I have already drawn will still be used.

    So I started to read about the basic emotions - and there sure are a lot of different theories concerning these 'basic' emotions. In addition, I realized that it is not necessary to cover all basic emotions (for example LOVE), but it is necessary to create some additional 'states' that reflect an adequate reaction in a dialogue (for example being curious).

    So I defined 8 different 'states' and 'emotions', based on the basic emotions Love, Joy, Surprise, Anger, Sadness and Fear. In the still you can see, from top left to bottom right: afraid, angry, curious, disgusted, normal, smile, sad, surprised.

    In the prototype I will have all these emotions only for the main character; Rafael's and Oswald's 'Emoticons' will be created when needed.
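    For what it's worth, in the dialogue scripts such states could simply be referenced by name; here is a minimal sketch of that idea - the enum, class and field names are our own illustration, not the actual project code:

    using UnityEngine;

    // Hypothetical sketch: the eight portrait states as an enum,
    // so a dialogue line can ask for the matching 'Emoticon' frame.
    public enum Mood { Afraid, Angry, Curious, Disgusted, Normal, Smile, Sad, Surprised }

    public class MoodPortrait : MonoBehaviour
    {
        public Sprite[] frames;   // assigned in the Inspector, same order as the enum

        public void Show(Mood mood)
        {
            GetComponent<SpriteRenderer>().sprite = frames[(int)mood];
        }
    }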


  • Blender to Unity - the general structure

    Hi there folks,

    this blog is getting more attention, which, of course, is a good thing. Some of the mails I have been getting lately have been inquiries concerning our Blender -> Unity workflow, especially concerning the interactive objects and the walkable areas. Well, here it is, a more detailed description of what we do:

     Blender to Unity, Tutorial 02

    1. The Images

    The first image shows the empty scene; in the case of the kitchen this is three planes, UV-unwrapped and textured with the background, the oven in the middle (tga with alpha channel) and the stairs to the left and the right (also tga with alpha channel). Another view of the scene can be seen in this article. The planes are Tracked-To the camera, so that they face the camera directly. As Unity and Blender use different unit systems, it can be quite difficult to get the camera distances right, including the focal length. So we use an Empty to transfer the position and facing direction of the camera to Unity - the focal length is adjusted manually, using the upper and lower border of the background planes for guidance.
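    On the Unity side, copying the imported Empty's transform onto the camera can be a one-time script. A minimal sketch, under the assumption that the Empty arrives in the scene as a transform you can drag into the field (the names are ours):

    using UnityEngine;

    // Sketch: align the Unity camera with the Empty exported from Blender.
    // The field of view / focal length still has to be matched by hand
    // against the background planes, as described above.
    public class CameraFromEmpty : MonoBehaviour
    {
        public Transform cameraEmpty;   // drag the imported Empty here

        void Start()
        {
            Camera.main.transform.position = cameraEmpty.position;
            Camera.main.transform.rotation = cameraEmpty.rotation;
        }
    }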

    2. The walkable Area

    The second thumbnail shows the walkable area (blue), a simple mesh which will be hidden in Unity. The character can only walk on this mesh, no matter where the player clicks. A simple pathfinding gets the character to the nearest possible point to the clicked spot.
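    In Unity, this kind of click handling comes down to a raycast against the hidden walkable mesh. A rough sketch, assuming the mesh has a MeshCollider; the direct position assignment and the crude bounding-box fallback stand in for the real movement and pathfinding:

    using UnityEngine;

    // Sketch: send the character to the clicked point on the hidden walkable mesh.
    public class WalkableClick : MonoBehaviour
    {
        public Transform character;
        public Collider walkableMesh;   // MeshCollider on the hidden walkable plane

        void Update()
        {
            if (!Input.GetMouseButtonDown(0)) return;

            Ray ray = Camera.main.ScreenPointToRay(Input.mousePosition);
            RaycastHit hit;

            if (walkableMesh.Raycast(ray, out hit, 1000f))
            {
                character.position = hit.point;
            }
            else if (Physics.Raycast(ray, out hit, 1000f))
            {
                // Clicked outside the walkable mesh: fall back to the nearest
                // point on its bounds (a crude stand-in for real pathfinding).
                character.position = walkableMesh.ClosestPointOnBounds(hit.point);
            }
        }
    }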

    Blender to Unity Tutorial 3 Blender to Unity Tutorial 4

    3. Interactive objects

    By interactive objects I mean clickable, static objects (in contrast to other characters). These include objects that can be looked at, objects that can be taken, and also doors - in short, every time you interact with the scene, you click on a hidden mesh. These meshes can be seen in yellow in the third thumbnail.
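    'Hidden but clickable' in Unity simply means the mesh keeps a collider while its renderer is switched off. A small sketch of how that could be automated; the "HG_" prefix check anticipates the naming convention from point 5, everything else is our own illustration:

    using UnityEngine;

    // Sketch: make interactive meshes invisible but still clickable.
    // The renderer is disabled; a MeshCollider stays so raycasts can hit it.
    public class HiddenGeometrySetup : MonoBehaviour
    {
        void Start()
        {
            foreach (MeshRenderer r in FindObjectsOfType<MeshRenderer>())
            {
                if (r.name.StartsWith("HG_"))   // naming convention, see point 5
                {
                    r.enabled = false;
                    if (r.GetComponent<MeshCollider>() == null)
                        r.gameObject.AddComponent<MeshCollider>();
                }
            }
        }
    }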

    4. Sprites

    Sometimes, when you take an item, you need to display the change in the scene. We do this by placing sprites in front of the textured planes from (1). These can be hidden or displayed and thus simulate a change in the scene. You can see them in the fourth screenshot with an orange outline.
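    Toggling such a change-sprite is then just a matter of switching its renderer on or off; a tiny sketch (class and method names are ours):

    using UnityEngine;

    // Sketch: show or hide a change-sprite plane, e.g. after an item is taken.
    public class ChangeSprite : MonoBehaviour
    {
        public void SetVisible(bool visible)
        {
            GetComponent<Renderer>().enabled = visible;
        }
    }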

    5. Naming Conventions and Import into Unity

    As we don't have many possibilities to transfer information from Blender to Unity, we use the names of the meshes to identify the objects and connect them to our xml-files (see this article). So, for example, an interactive object is called HG_23_Knife_K. These strings can be parsed by Unity and turned into information: HiddenGeometry - ID 23 - name: Knife - room: Kitchen. This way, we can not only separate the different kinds of objects (walkable area, interactive objects, sprites, background images), but also automate tasks like importing and setting certain attributes with a simple script in Unity.
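    A minimal sketch of how such a name can be parsed on the Unity side - the class is our own illustration, only the naming scheme itself is the one described above:

    using UnityEngine;

    // Sketch: split a mesh name like "HG_23_Knife_K" into its parts.
    public class ObjectInfo
    {
        public string Kind;   // e.g. "HG"  -> HiddenGeometry
        public int Id;        // e.g. 23
        public string Name;   // e.g. "Knife"
        public string Room;   // e.g. "K"   -> Kitchen

        public static ObjectInfo Parse(GameObject go)
        {
            string[] parts = go.name.Split('_');   // ["HG", "23", "Knife", "K"]
            ObjectInfo info = new ObjectInfo();
            info.Kind = parts[0];
            info.Id   = int.Parse(parts[1]);
            info.Name = parts[2];
            info.Room = parts[3];
            return info;
        }
    }

    An import script can then switch on the Kind part to decide whether a mesh becomes walkable area, hidden geometry, a sprite or a background plane.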

    6. Scripting with the Online-Tool

    The IDs of the objects are unique for each object and are handled and generated by our Online-Tool, a PHP script communicating with a MySQL database. Each interactive object gets an xml-File, generated by the tool, describing for example certain actions to be taken when the object is clicked. For the xml-Files we also use a naming convention, so Unity can connect an xml-File to a certain object in the scene by comparing the IDs.
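    On the Unity side, matching an object to its xml-File then boils down to building the file name from the parsed ID. A sketch, assuming the xml-Files sit in a Resources folder and are named after the ID - both the folder layout and the naming here are our assumptions:

    using System.Xml;
    using UnityEngine;

    // Sketch: load the xml-File that belongs to an interactive object by its ID.
    // Assumes files like "Resources/ObjectXml/23.xml" (our guess at the layout).
    public static class ObjectXmlLoader
    {
        public static XmlDocument Load(int id)
        {
            TextAsset asset = Resources.Load<TextAsset>("ObjectXml/" + id);
            XmlDocument doc = new XmlDocument();
            doc.LoadXml(asset.text);
            return doc;
        }
    }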

    Well, I hope this helps someone understand a little bit more about how we do things. If not, use the comments or the contact form :-) Happy blending and uniting...

     




We are an independent game design studio based in Cologne, Germany.

serious monk
Manuel Schmitt
Maybachstrasse 155
50670 Köln
+49 221 93825101
Write us an Email





About this Webpage:

This webpage is a developer blog, providing information for the interested about the production of our computer games. It shall not only serve as a product showcase, but also as a platform for other developers to read about our problems, our solutions and our thoughts during development.

We will offer tutorials, tests and opinions related to indie game development and, of course, information about the games we are currently working on.

Stay tuned. Happy gaming.








