The Last Guardian – Game of the Year 2016?

I finished The Last Guardian over Christmas, and it is my favourite game from last year. Next semester my final year module will focus on programming AI in games, and from that perspective the creature AI in The Last Guardian is fascinating. For me, the gameplay, story and puzzle design are more absorbing and immersive than those in Uncharted 4: A Thief’s End, but it’s the relationship between the player character and his creature companion, Trico, that really makes the game stand out. At times the interaction with Trico can be frustrating, but when it works the action can be breathtaking, and as a player you can become very attached to Trico. The fact that Trico doesn’t always do what you want it to do can add to the realism of the creature intelligence, especially if you own pets and can relate to real animal behaviour. The imperfection of the AI reminds us how difficult the task was and how well it has been achieved.

The Last Guardian provides a state-of-the-art example of how to create AI companions that are essential to gameplay and to whom the player can develop a strong emotional connection. I found the game rewarding to play not only for its story, world and puzzle design; the ending also moved me in a way that I usually only experience through reading books or watching movies. It has even been argued that Trico provides a role model for chat bot agent design, such as Alexa, Siri, Cortana, and Google Assistant.

I played fewer games last year than I usually would, but I did complete Doom and Inside, which are two fun and well-designed games. The two most common game of the year picks seem to be Uncharted 4: A Thief’s End and Overwatch, both of which I have played quite a bit and enjoyed. However, if you consider yourself a student of games and are interested in game design, then The Last Guardian is clearly the most novel of these and well worth the 10–15 hours it takes to complete. Don’t miss it.

ARToolKit now available under LGPL v3.0 for free use

ARToolKit (http://artoolkit.org/) can now be used for free in commercial applications under LGPL v3.0. When working on James Burke’s PhD, which focused on AR for games-based upper arm stroke rehabilitation, we found ARToolKit to be easily the best tool for the job (and it was free for academic use). After the PhD was completed we looked into commercialising the games, but the cost of the commercial license was a real obstacle to taking this forward. The new license terms are obviously very helpful for people wanting to harness state-of-the-art AR technologies within their software and apps.

Distributed Scene Graph

I just came across this interesting video on distributing multiplayer game processing across a network. It relates quite closely to what we are trying to achieve on a current PhD project. Microsoft have demonstrated an example of cloud processing using Azure in Titanfall and other games, but it’s not quite the same.

A prevailing approach among some leading games companies/publishers, e.g. OnLive and Gaikai, is to stream gameplay from the cloud – an approach that could be described as streaming interactive gameplay. The game is processed in the cloud and the rendered gameplay screens are streamed to client machines. The advantage of this approach is that client machines do not need to be as powerful, and it may be more energy efficient (in global terms). An obvious disadvantage is that QoS and QoE are dependent on the quality of the network. Shinra, a Square Enix company, recognise this issue but are building their MMO engine for a future (faster) network! Article here.
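
To make the streaming idea concrete, here is a toy sketch (nothing to do with how OnLive, Gaikai or Shinra actually implement it): a server loop that renders a frame of the game, streams it to the client as a length-prefixed blob, and reads the player’s input back before simulating the next tick. The render_frame and apply_input callables are hypothetical stand-ins for a real engine, and a real service would use hardware video encoding rather than raw frames.

```python
# Toy sketch of a "streaming interactive gameplay" server loop.
# render_frame(state) -> encoded image bytes, apply_input(state, data) -> None
# are hypothetical hooks into a game engine; frames and input are sent as
# length-prefixed blobs over a single TCP connection.
import socket
import struct

def recv_exact(sock, n):
    """Read exactly n bytes (recv may return partial data)."""
    buf = b""
    while len(buf) < n:
        chunk = sock.recv(n - len(buf))
        if not chunk:
            raise ConnectionError("client disconnected")
        buf += chunk
    return buf

def serve(render_frame, apply_input, port=9000):
    state = {"tick": 0}
    srv = socket.socket()
    srv.bind(("0.0.0.0", port))
    srv.listen(1)
    client, _ = srv.accept()
    while True:
        frame = render_frame(state)                              # render in the cloud
        client.sendall(struct.pack("!I", len(frame)) + frame)    # stream the frame out
        n = struct.unpack("!I", recv_exact(client, 4))[0]
        apply_input(state, recv_exact(client, n))                # player input comes back
        state["tick"] += 1
```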

Recent research papers

It’s been a busy period of research, with the research exercise coming towards a conclusion and new grant opportunities appearing. Barry Herbert is in the last stages of writing up his PhD, but we managed to get a paper out the door and accepted for ITAG. The paper presents the results of an experiment investigating user typologies for the gamification of a virtual learning environment; we show that it is possible to distinguish between learners according to the ways in which they can be motivated to engage with learning processes and content.

  • Barry Herbert, Darryl Charles, Adrian Moore, Therese Charles, “An Investigation of Gamification Typologies for Enhancing Learner Motivation”, ITAG: Interactive Technologies and Games – Education, Health and Disability, Nottingham Trent University, Nottingham, 2014

Extending these ideas and applying them within a rehabilitation context, we developed a new framework called PACT (people, aesthetics, context, technology), and we are looking forward to presenting a short paper on this at ICDVRAT. PACT has an implicit focus on participatory design and involvement of all the relevant stakeholders from the beginning of a rehabilitation design process. The emphasis on gamification within the PACT framework brings a number of significant advantages. Firstly, the outcome of a gamification process need not be an obvious game: it may simply be the addition of fun feedback (e.g. points and badges) to a non-game context (e.g. physical movements around the home), the use of gaming hardware in a non-game context (e.g. digital painting), or the use of game worlds to immerse and inspire (e.g. walks with friends in virtual game worlds). Secondly, new, more advanced gamification approaches can help tailor system design to account for diversity in motivation between different people.

  • Darryl Charles, Suzanne McDonough, “A Participatory Design Framework for the Gamification of Rehabilitation Systems”, ICDVRAT: The 10th International Conference on Disability, Virtual Reality and Associated Technologies 2014.

Craig Hull is just about to start the second year of his PhD and he recently presented his first paper (below). I’m particularly pleased that we received generally positive feedback on the approach and that we managed to get the word FRAGED into the title – Babylon 5 :-). We should see if we can get “smeg” and “frak” into future paper titles.

  • Craig Hull, Darryl Charles, Philip Morrow, Gerard Parr, “FRAGED: A Framework for Adaptive Game Execution and Delivery to Improve the Quality of Experience in Network Aware Games”, PGNET 2014.

We also just received news that our research on the use of the Leap Motion controller in stroke rehabilitation has been accepted for publication in the Journal of Assistive Technologies. This technology has a lot of potential and we need to try to find some funding to take it forward.

  • Darryl Charles, Katy Pedlow, Suzanne McDonough, K. Shek, Therese Charles, “Close Range Depth Sensing Cameras for Virtual Reality based Hand Rehabilitation”, Journal of Assistive Technologies, 2014.

In September Dominic Holmes is starting a PhD with us at Ulster to investigate the use of modern game hardware and software to create tailored and motivating exercise programmes to help prevent falls. He will focus on adherence to exercise as a measure of success and investigate variation in engagement mechanisms between users. The fun part will be the chance to work with Oculus Rift VR (or similar), Leap/Kinect/Myo controllers and the Omni treadmill. Dominic has been shortlisted for the Creative Buzz Award for his excellent work on his final year project. He designed and made a feedback jacket that connects to a computer game via an Arduino microcontroller – it provides a more immersive gaming experience by supplying haptic feedback around a player’s body, e.g. when shot, a player feels an actuator fire in the area of the body that corresponds to where they were hit in-game. We hope that he will be able to use some of these approaches in his PhD. Dominic will be at Culture Tech in Derry/Londonderry in September for a 4-day bootcamp – I will put more information up about this closer to the event.
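
As a rough illustration of the game-to-jacket link described above (a sketch of the general idea, not Dominic’s actual code or protocol), the game could send a one-byte body-zone ID over the Arduino’s USB serial connection whenever the player is hit, and the jacket firmware would then pulse the actuator wired to that zone. The zone numbering and port name below are made up for illustration, and the pySerial library is assumed on the game side.

```python
# Hypothetical game-side hook: send a one-byte body-zone ID to the jacket's
# Arduino over USB serial whenever the player takes a hit. The firmware would
# map each zone ID to an actuator pin and pulse it briefly.
import serial  # pySerial

ZONES = {"chest": 0, "back": 1, "left_arm": 2, "right_arm": 3}  # made-up zone map

jacket = serial.Serial("/dev/ttyUSB0", 9600, timeout=1)  # port name is illustrative

def on_player_hit(body_zone):
    """Called by the game whenever the player is shot."""
    jacket.write(bytes([ZONES[body_zone]]))

on_player_hit("left_arm")  # e.g. shot in the left arm -> actuator near the left arm fires
```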

Presenting our Oculus and Leap Motion Rehabilitation Research at the Joint Higher Education exhibition in Stormont

Yesterday Suzanne McDonough, Katy Pedlow (Health and Rehabilitation Sciences) and I had the privilege of presenting our recent research on the Oculus Rift VR headset and Leap Motion Controller to our local politicians at Stormont. Some pictures below.

We have been progressing from our previous simulations of traditional rehabilitation tasks using the Leap controller to investigating best practices in interaction design among recent commercial software releases. We are keen to map clinical requirements for rehabilitation exercises to existing games that contain similar movements in their controller design, and we are also learning from these best practices in the design of our own rehabilitation games – the main issue with commercial games is (obviously) that they are not tailored to treatments and are not usually adaptive to individual requirements. During this phase of our investigations we are also looking at whether the use of the Oculus Rift VR headset improves the usability and function of our rehabilitation games. In particular, can it help patients with depth perception as they reach out in a virtual 3D scene?

The demos generated a lot of interest, as they have when I have presented the technology at our university open days (see bottom picture) and on a recent visit to a local primary school.


Playing with programming the Leap Motion and Oculus Rift together

I had a couple of hours this afternoon to make some progress learning how to use the Oculus Rift and the Leap Motion controller together.

It’s not hard when you know how, and in about 30 minutes I had connected an Oculus camera to one of the Leap Motion flying samples, with the orientation of the palms used to steer the ship.
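
The demo itself was one of the Leap Motion flying samples with an Oculus camera attached, but the core mapping is simple enough to sketch with the (v1/v2-era) Leap Motion Python SDK: poll the controller each frame and turn palm pitch, yaw and roll into steering values. The ship object mentioned in the comment below is hypothetical – in the actual sample these values drive the craft directly.

```python
# Minimal sketch of palm-orientation steering using the Leap Motion Python SDK
# (the Leap module ships with the SDK). Applying the values to a ship is left
# as a comment because the flying sample handles that itself.
import Leap
import time

controller = Leap.Controller()

while True:
    frame = controller.frame()
    if not frame.hands.is_empty:
        hand = frame.hands[0]
        pitch = hand.direction.pitch * Leap.RAD_TO_DEG    # nose up / down
        yaw = hand.direction.yaw * Leap.RAD_TO_DEG        # turn left / right
        roll = hand.palm_normal.roll * Leap.RAD_TO_DEG    # bank left / right
        # ship.steer(pitch, yaw, roll)  # hypothetical hook into the flying sample
        print("pitch %.1f  yaw %.1f  roll %.1f" % (pitch, yaw, roll))
    time.sleep(1.0 / 60)
```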

I’m just getting a feel for what works and how to program these technologies. I’m not sure we can use the Oculus in our games for rehab work (certainly not in games where the camera moves) due to vertigo and similar issues, but it’s really interesting and fun to work with.