
Behavioural House Demo Video

[vimeo http://vimeo.com/22991321]

Skydome problem solved

I posted my problem on the Blender Artists forum to see if anybody knew what the issue was. All I needed to do was change the clipping distance of the camera; now it works beautifully.
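For reference, here is a minimal sketch of the same fix done through Blender 2.49’s Python API; the camera name "Camera" and the clip value are assumptions (I actually changed it in the camera’s edit panel):

import Blender

# fetch the camera datablock by name ("Camera" is an assumption)
cam = Blender.Camera.Get("Camera")
# push the far clip plane out past the skydome (the value is an assumption)
cam.clipEnd = 1000.0
# refresh the interface so the change shows up
Blender.Redraw()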

Final year project: Big progress update

I haven’t updated my blog for a while as I have been working hard on my Final Year Project. I should have kept it updated with progress more often (oops), so in this post I am going to catch up on my project in one big update.

Over the last few weeks I have been battling with Blender to try and get it to do what I want it to do. I talked to Pete Carss about the Blender Game Engine and he said it isn’t limited; you just need to do things in a different way. And by god he was right, I have tried so many different methods to get certain things working!

So this is what I have working:

A starry skydome which acts as the environment for my project. It has nebulas and stars scattered in places; more detail later on.

I have movement set up, so the camera follows a small sphere which represents the object the audience holds. The sphere also has a refraction/reflection script attached, so it looks similar to a raindrop or bubble. At the moment, moving the mouse rotates the camera and lets you move around the space. This will change soon: the user will not use a mouse; instead the movement should depend on the rotation of the object they are holding.
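For the curious, the mouse-look part boils down to something like the sketch below for BGE 2.49, attached to the camera with an Always sensor (pulse mode on) and a Mouse > Movement sensor. The sensor name "Mouse" and the sensitivity value are assumptions for illustration, not my exact setup:

import GameLogic
import Rasterizer

cont = GameLogic.getCurrentController()
own = cont.owner

w = Rasterizer.getWindowWidth()
h = Rasterizer.getWindowHeight()

# how far the pointer has moved from the window centre this frame
mouse = cont.sensors["Mouse"]
dx = (w / 2) - mouse.position[0]
dy = (h / 2) - mouse.position[1]

sensitivity = 0.0005
own.applyRotation([0.0, 0.0, dx * sensitivity], False)  # yaw around world Z
own.applyRotation([dy * sensitivity, 0.0, 0.0], True)   # pitch around local X

# re-centre the pointer so the deltas stay relative
Rasterizer.setMousePosition(w / 2, h / 2)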

I also have star particles! This took a lot of research to complete, but I wrote a simple script which does exactly what I want it to. More on this later.

1. Skybox/dome

So firstly I did a lot of research into skyboxes vs skydomes. If done effectively, a skybox can look amazing: as if there is no box at all, but a sphere. It requires a net drawn out in Photoshop, GIMP or any other image editor. So I worked on producing a few space scenes in Photoshop and mapped them to a net. This was my first attempt:

After the net was created, I copied each of the faces, added a 5-pixel border around each edge, then took 5 pixels from the joining edge and pasted them over the seam. This had to be done for each of the edges to make the box seamless.
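This bleeding step could also be scripted rather than done by hand. Purely as an illustration, here is a rough sketch using the Python Imaging Library for a single pair of joining edges; the file names and the 512-pixel face size are assumptions:

from PIL import Image

BLEED = 5   # how many pixels to copy across the seam
SIZE = 512  # face size in pixels, an assumption for illustration

front = Image.open("front.png")
right = Image.open("right.png")

# take a 5-pixel strip from the left edge of the right face...
strip = right.crop((0, 0, BLEED, SIZE))
# ...and paste it over the right edge of the front face to hide the seam
front.paste(strip, (SIZE - BLEED, 0))
front.save("front_bled.png")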

In Blender I created a cube, flipped the normals and unwrapped it. I then added a texture to each face and made the material shadeless. From there I could scale each of the UV faces so they fitted together correctly.

I found the process was quite long and laborious, and the result of my first attempt didn’t work out as well as I had hoped. This is the result:

You can see it is a box, as the seams are quite obvious. It also looks a bit too close; the box is actually quite big, but I think I need to make the nebulas and details smaller to make them seem further away.

I then thought about using a skydome instead. The process was much faster than creating a net. All I needed to do was create two images, one for the top dome and one for the bottom. The great thing about making space scenes is that a starry sky looks very random, and one patch is hard to tell from another, which makes merging two edges together very easy.

These were the two images I used for both domes:

In Blender I created a sphere, went into edit mode and removed half of the vertices. I then duplicated the remaining dome and rotated the copy so it formed the bottom half, giving me two domes connected to each other. I then attached each texture to its corresponding dome. The result was MUCH better: the edges were seamless and it looked like a proper environment. The only problem I have at the moment is that when I get far enough away from an edge, or if I scale the spheres up too large, a blob appears on the screen which only disappears when I get nearer to it. This is of course a bit of a problem, but it’s something I can look into; I’m sure there is a fix for it somewhere.

Here is what it looks like:

2. Stars

The stars took a lot of research to complete. I really wanted to find a dynamic way of placing 1000 stars in the scene randomly. Obviously the first thing which came to my mind was particles: Blender has a particle system which ranges from emitters to static particles for grass, hair and fur. After trying lots of different methods I soon found out that particle systems do not work in the game engine, so I would have to find a different way of doing this.

I asked on the Blender Artists forums about how I would do this, and a man called SolarLune said that he had built a particle engine for the BGE (X-Emitter). Ideal! I did some research into it, but after testing it was apparent that the system was built for Blender 2.5x, while I am running 2.49. I have to stick to 2.49, as 2.5x apparently does not have dome functionality built in.

I then thought: would it be possible to generate 1000 small spheres, with a low number of segments and rings, through Python? I did some research into generating game objects in real time and found exactly what I needed. I then modified the script to spawn my sphere, randomised the positions of the stars and finally wrapped it all in a loop which ran 1000 times.

Here is the script (it runs from a Python controller; the ‘Init’ property on the object makes sure the stars are only spawned once):

import GameLogic
import Blender  # only needed here for Blender.Noise.random()

# get the controller and the current scene
controller = GameLogic.getCurrentController()
scene = GameLogic.getCurrentScene()

# get the object this script is attached to
player = controller.owner

# random number between two numbers
def rnb2n(lo, hi):
    return lo + (hi - lo) * Blender.Noise.random()

# only spawn the stars on the first run; the 'Init' property guards this
if player['Init'] == 0:
    a = 0
    while a < 1000:
        a += 1
        # add a copy of the low-poly sphere at the Empty's position
        star = scene.addObject("Sphere", "Empty", 0)
        # scatter it randomly within the bounds of the skydome
        _x = rnb2n(-34, 34)
        _y = rnb2n(-34, 34)
        _z = rnb2n(-2, 10)
        star.localPosition = [_x, _y, _z]

    player['Init'] = 1

The script runs beautifully, and I couldn’t have asked for a better result. I really wanted small particles to fly past the camera as I feel it gives it a lot more depth.

I will soon upload a video with all these elements working together to show you what it looks like.

There is still a lot to do on this project. I need to create clusters of stars which symbolise a path the audience will have to pass through; this will take a long time. I also need to integrate the controls between the Arduino board and Blender. I have a basic version working, but not with the new movement scripts.

Final year project build – Problems and workarounds

I haven’t posted anything about my final year project for a while, as I have been working with Blender, Python, Processing and Arduino to see if I can get a basic prototype working. What I’ve learnt over the last week: Python and Blender are tedious, Blender’s game engine is incredibly limited in what it can do, and there seem to be massive workarounds for some things which should be relatively simple. This post goes through the tasks I have attempted, the problems I have faced and what I have done to get around them.

 

1. Creating stars

Any Blender tutorial on making stars will tell you to create a UVSphere, delete everything but the vertices and texture the vertices with the Halo effect, which gives them a glowing, shiny look. It all looked great when I rendered the current frame; however, when using the Blender Game Engine, nothing showed. After spending a long time trying to find out what the hell was going wrong, someone told me that the Halo effect does not work in the Blender Game Engine (BGE).

So I thought about different ways I could create glows around a sphere to get a star effect. Someone online suggested creating the normal sphere, then adding a plane in the centre of it with an Always sensor, a controller and an Edit Object actuator set to track an object, targeting the camera. The plane will then always rotate so it points towards the camera. However, I’m not sure how memory intensive this method may be. I plan on having thousands of stars in my scene which I can fly through; if every star has an Always sensor attached to it, it may slow the application right down. Am I right in thinking that an Always sensor acts the same way as an EnterFrame event in Flash?
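If the logic bricks do prove too heavy, one alternative I could try (a hedged sketch for BGE 2.49, untested at scale; the "OBstar" name prefix and the camera name are assumptions) would be a single script that turns every star plane towards the camera in one pass, so only one Always sensor is needed:

import GameLogic

scene = GameLogic.getCurrentScene()
camera = scene.objects["OBCamera"]

for obj in scene.objects:
    # only touch the star planes (names assumed to start with "OBstar")
    if obj.name.startswith("OBstar"):
        # point the plane's +Y axis along the direction to the camera
        direction = obj.getVectTo(camera)[1]
        obj.alignAxisToVect(direction, 1, 1.0)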

stars

2. Creating a realistic Nebula

For a little while I was thinking about how to make a realistic nebula which I would be able to fly through. I thought one way around it would be to create a particle system similar to smoke or clouds, then somehow change the colour so it’s multi-coloured.

I found a really interesting method which ElusivePete on YouTube explained for C++ games: http://www.youtube.com/watch?v=CaTI2d0tQME&feature=related. He explains that you can create a lot of interlinking planes with different textures to build a 3D-ish nebula, then attach some maths to each plane to analyse where the camera is relative to it and decide how much to fade the plane. Basically, the plane should be fully visible when the camera is facing it straight on, but invisible when the camera is side on. Applied to lots of different planes, this makes the planes fade into each other and creates a seamless nebula effect. The great part is that the Blender Game Engine wouldn’t be able to create this sort of effect without using this method, so flying through it in a game would look beautiful.

Over the weekend I tried to convert this to Python and Blender. I created a plane and attached a texture to it. I then added the Python code and the maths:

import GameLogic as g
from Mathutils import Vector, Matrix

cont = g.getCurrentController()
own = cont.owner

scene = g.getCurrentScene()
objList = scene.objects
camera = objList["OBCamera"]

# vertex normal == face normal in the case of a plane
normal = Vector(own.meshes[0].getVertex(0, 0).normal)
# unit vector pointing from the plane to the camera
camVec = Vector(own.getVectTo(camera)[1])

# rotate the normal into world space using the plane's orientation matrix
x = own.localOrientation[0]
y = own.localOrientation[1]
z = own.localOrientation[2]
normal = Matrix(x, y, z) * normal

# cosine of the angle between the two: 1 when head on, 0 when side on
alpha_rate = abs(normal.dot(camVec))
print alpha_rate

own['alpha'] = alpha_rate
# write the alpha into each of the plane's four vertices
for v in [0, 1, 2, 3]:
    own.meshes[0].getVertex(0, v).setRGBA([0, 0, 0, alpha_rate])

So what this is doing is:

  • Getting the plane’s normal and rotating it by the plane’s orientation matrix
  • Getting the vector from the plane to the position of the camera
  • Normalising both
  • Using the dot product to work out the cosine of the angle between the two
  • The dot product returns a number between 0 and 1: 1 when the camera is straight on and 0 when the camera is side on. Perfect for manipulating alpha values (a quick worked example follows this list).
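Here is that idea worked through in plain Python, with made-up vectors:

# dot product of two 3D vectors
def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

normal = (0.0, 0.0, 1.0)   # unit normal of the plane

head_on = (0.0, 0.0, 1.0)  # camera looking straight at the plane
side_on = (1.0, 0.0, 0.0)  # camera looking along the plane

print abs(dot(normal, head_on))  # 1.0 -> fully visible
print abs(dot(normal, side_on))  # 0.0 -> fully transparent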

After all this work it was time to see if I could lower the opacity of the plane, using these lines at the bottom of the above script:

for v in [0, 1, 2, 3]:
    own.meshes[0].getVertex(0, v).setRGBA([0, 0, 0, alpha_rate])

This code worked on a plain material; however, when I added an image texture to the plane, it stopped working. I spent a long, long, lonnng time trying to figure out how to get the alpha transparency to work with an image attached, but I just can’t get it to work! The only way I could get it to respond was to tick the ObColor option, but this turned the image black, so that wasn’t an option. I even tried using an IPO to create a fade-out animation, but this did not work either.

I’m completely stuck here. I’m so close to creating this nebula, but something as simple as changing the alpha values of a plane on the fly doesn’t work! I’m getting so frustrated with Blender and I’m tempted to throw in the towel for the BGE and start using Unity.

Stuck

Now I’m in a position where I don’t know what to do. I would love to start again and try Unity instead. I find the Blender Game Engine far too limited for what I want to do, and there are a lot of reports which say that Unity is a far better game engine than Blender: not only is it far more powerful, it’s also much easier to use.

My only problem is that I have been pointed to a plugin for Unity called Omnity, which domifies Unity content so I can use my game engine in the dome. However, I have tried out the software and I am getting errors. I have emailed the company, Elumenati, for some help to see what is going wrong, but they have not got back to me yet. I was testing on a free Unity licence, so I don’t know whether Omnity requires Unity Pro to run; in that case I wouldn’t be able to use it, as it is far too expensive for me to buy.

I don’t want to waste more time trying to figure things out in Blender when I could be building something way better in Unity, but I also don’t want to spend loads of time in Unity and then find out I won’t be able to domify it. I’m stuck; I’m not sure what to do.

Applying RGB LEDs to the house

I was thinking about how I could apply these colours to the house on different events, so I came up with a list of different events which generally happen within a family (a rough sketch of how these might map to colour values follows the list):

  • Coming home after a stressful day: turn lighting to a warm yellow
  • Going to the doctor, dentist or vet: turn lighting green
  • Arriving home after doing sport: turn lighting to a cool cyan
  • Party in the house: use disco lights!
  • Lots of events happening at work or school: turn lighting red
  • When people wake up: slowly make the lights brighter
  • Check the temperature outside: if it’s above 20 degrees, turn lighting to a cool colour; if below 5, turn to a warm colour
  • Turn lights on individually: for example, turn the bathroom to warm lighting if someone is having a bath or shower
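As a rough sketch of how this might look in code (plain Python; the event names and exact colour values are assumptions for illustration):

# map household events to RGB values, 0-255 per channel (all assumptions)
EVENT_COLOURS = {
    "stressful_day": (255, 180, 60),   # warm yellow
    "doctor_visit":  (0, 255, 0),      # green
    "after_sport":   (0, 255, 255),    # cool cyan
    "busy_week":     (255, 0, 0),      # red
    "waking_up":     (255, 255, 255),  # white, faded in slowly
}

def colour_for(event):
    # fall back to off if the event isn't recognised
    return EVENT_COLOURS.get(event, (0, 0, 0))

print colour_for("after_sport")  # (0, 255, 255)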

One major factor I could take from the psychotropic house story by J.G. Ballard is that the house has a memory, so it could remember certain moments and react to them in the future.

RGB LEDs

To advance with the production of my space project, I’m going to be adding some RGB lighting to the house, so the colours of the lighting can change depending on the calendar and routine. I have bought some RGB LEDs, a prototyping board and some jumper wires from oomlout. I have got the basic example working, where the lights act like disco lights, changing colour every half second. There is also an option to fade from one colour to another; the problem being that each LED needs 3 PWM pins, and a basic Arduino board only has 6 PWM pins, so I can enable fading on 2 LEDs while the rest will have to change instantly. The advantage of using the PWM pins is that you can get any colour you desire; without them you can only get about 8 colours:

  • Red
  • Green
  • Blue
  • Yellow
  • Cyan
  • Magenta
  • White
  • Black (off)

However, this will be sufficient for my project. The LEDs are pretty bright already, but to make them stand out properly I might create a lampshade out of tin foil to reflect the light and hopefully make it brighter. I will have to experiment!
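To make the two cases concrete, here is a small sketch in plain Python (pin numbers and timings left out) of the eight on/off colours versus the kind of interpolation a PWM fade performs:

from itertools import product

# without PWM each channel is simply on or off, giving 2^3 = 8 colours
for r, g, b in product([0, 255], repeat=3):
    print (r, g, b)  # (0, 0, 0) is off, (255, 255, 255) is white

# with PWM each channel takes any value 0-255, so a fade is just a
# linear interpolation between two colours
def fade(start, end, t):
    # t runs from 0.0 (start colour) to 1.0 (end colour)
    return tuple(int(s + (e - s) * t) for s, e in zip(start, end))

print fade((255, 0, 0), (0, 0, 255), 0.5)  # halfway from red to blue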

Here are a few photos of my example:

Example1

Example2

Research into meditating with objects

I have been doing a little research into meditation with objects. I wanted to find out if there were any techniques commonly used to meditate with objects, as I could use one within my project. I found a quote from The Glass Temple describing how to meditate with an object:

Now, picture your object, whether in your eyes, or in your mind. What is its shape, what is it made of? Imagine you are touching it. Is it hard or soft, is it smooth or rough? What is its temperature? In your mind, run your finger down the side. What does it feel like? Feel the texture on your finger. In your mind, hold it in your hands. Is it heavy or light? Does it shine, is it dark or light? Now look a little deeper, below the surface. What does it look like inside? What does it feel like? Feel the texture of it on your skin. Do you see colour? Now, expand a little… what is its nature? What is its purpose? Is it happy? Does it like being what it is? Does it like doing what it does? Try to imagine you are the object. What do you feel? Do you think anything? Do you have emotion? Do you have knowledge? What do you know?

All of this can be done just by feeling the object, which fits my project easily, as the person using it will be holding and feeling it. The quote above also states “Try to imagine you are the object”. At some point in the project I could zoom in to the object so that, as the person turns it, the whole screen turns, which would create a really nice effect in the dome: the feeling that you are actually moving. The really interesting part of this would be investigating the user’s relationship with the object. When they go into this state, how will they react with it?