Mind Readers are here! These should make it much easier to see what’s going on: simply point one at a pet and it will tell you what that pet needs to be happy, and what it’s missing.

In VR, it’s already integrated into the inspection tool. On flatscreen, pressing the interact button while you’re not pointing at anything will now pull up the tool.

This is the start of a new tool system that unifies the VR and non-VR code. Moving forward, I should only need to make one version of each tool, and it will work in both modes!
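For the technically curious, here’s a rough idea of how a shared tool layer like this can be structured: each tool is written once against a common interface, and only the targeting code differs per mode. This is a minimal sketch in TypeScript with entirely hypothetical names, not the game’s actual code:

```typescript
// All names here are hypothetical; this shows one way to share tool
// code between VR and flatscreen, not the game's actual implementation.

interface Pet {
  name: string;
  needs: string[];      // everything this pet wants
  satisfied: string[];  // needs that are currently met
}

// Each tool is written once against this interface.
interface Tool {
  name: string;
  use(target: Pet | null): void;
}

class MindReader implements Tool {
  name = "Mind Reader";
  use(target: Pet | null): void {
    if (!target) {
      console.log("Point the Mind Reader at a pet.");
      return;
    }
    const missing = target.needs.filter(n => !target.satisfied.includes(n));
    console.log(`${target.name} is missing: ${missing.join(", ") || "nothing!"}`);
  }
}

// The only mode-specific piece is how a target gets picked:
// VR raycasts from the controller, flatscreen from the camera.
type TargetPicker = () => Pet | null;

function triggerTool(tool: Tool, pickTarget: TargetPicker): void {
  tool.use(pickTarget());
}

// Both modes drive the exact same MindReader instance.
const reader = new MindReader();
const fluffy: Pet = { name: "Fluffy", needs: ["food", "play"], satisfied: ["food"] };
triggerTool(reader, () => fluffy); // stand-in for a VR controller raycast hit
triggerTool(reader, () => null);   // flatscreen interact press aimed at nothing
```

The nice part of a split like this is that adding a new tool means writing one class, and both modes pick it up for free.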

What’s Next?

Today marks three weeks since the early access release! The development pace has been going well: a week of bugfixing, a week of playing Half-Life and Doom, and a week working on the Mind Readers. The next update is going to add a separate tutorial area. Even if you already know how to play the game, it will have some dialogue and worldbuilding to help you figure out why you’re doing all this! It’ll also have some great music.

Check it out on Steam!