Working exclusively with W/ARE presents both an incredible opportunity and some interesting challenges for us as developers. Their EEG and tACS systems let us receive a remarkable amount of direct feedback from the player. However, because the game itself is induced directly into the image-processing regions of the brain, it is difficult to convey what the experience is actually like.
To share the game with our audience as it develops, we have dedicated a great deal of our development time to creating a virtualized neurological image-processing analogue system—basically an AI brain—that can process signals from the game engine and output graphical results in .jpg and .mpeg formats. We've nicknamed the system ORTHAGON.
While both the game and the ORTHAGON graphical processing system are in very early alpha, we would like to share some of the test assets the system has generated for us. These assets were generated from descriptive text provided by community volunteers.