This milestone marks our Alpha Build of the Kendo Gakko Application. I spent about 12 hours working on the socket.io networking, creating a basic lander page that lets users navigate to the different test environments, researching device detection in A-Frame, and initializing spatial sound effects.
This sprint I focused on creating the template for network communication between our Master and Student environments. The Master Test page has five buttons: 1, 2, and 3 for building the sequence, plus Send Sequence and Clear Sequence, which do exactly what their names suggest. Entering a sequence and pressing Send Sequence sends it to the Student page as an array of numbers. The Student page has four buttons: 1, 2, and 3 for entering a response that matches the provided sequence, and Send Response for sending that response back to the master. When the student's response comes back, it is compared with the original sequence, producing an array of true/false values, one per input: true where the student matched the corresponding input correctly, false where they did not. The true/false sequence is shown to both users.
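The grading step described above can be sketched as a simple array comparison. The `gradeResponse` name and the socket.io event names in the trailing comment are placeholders I made up for illustration, not the actual code:

```javascript
// Sketch of the sequence-grading step: one boolean per position,
// true where the student's input matches the master's sequence.
// (gradeResponse is a hypothetical name, not from the project.)
function gradeResponse(sequence, response) {
  return sequence.map((step, i) => response[i] === step);
}

// Example: master sent [1, 3, 2], student answered [1, 2, 2]
console.log(gradeResponse([1, 3, 2], [1, 2, 2])); // → [ true, false, true ]

/* On the server this might sit behind the socket.io events, e.g.:
 *   socket.on('send-sequence', (seq) => {
 *     current = seq;
 *     socket.broadcast.emit('sequence', seq); // forward to the student
 *   });
 *   socket.on('send-response', (res) => {
 *     io.emit('result', gradeResponse(current, res)); // show both users
 *   });
 * Event names here are assumptions.
 */
```

Broadcasting the result with `io.emit` (rather than replying to one socket) is what lets both the master and the student see the same true/false sequence.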
Currently this information is shown only in the console, since this is a test framework.
To let users easily navigate to the Master environment, Student environment, and VR environment, I created a very basic lander page with buttons that link to the different pages of the Alpha Application.
I also did some preliminary research into device detection in A-Frame. The plan is to have a single button on the application's lander page that, when pressed, detects the user's device and navigates to the appropriate page: Desktop/Oculus Rift users go to the VR environment, while mobile devices go to the master mobile environment.
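A-Frame exposes device checks under `AFRAME.utils.device` (for example `isMobile()`), which is the likely hook for this routing. Below is a minimal sketch of that plan; the page paths and the `pageForDevice` helper are placeholders I invented, and the browser wiring is shown only in comments:

```javascript
// Hypothetical helper: pick a destination page from a device check.
// Paths are placeholders, not the project's actual routes.
function pageForDevice(isMobile) {
  return isMobile ? '/master.html' : '/vr.html';
}

/* On the lander page this would be driven by A-Frame's device utils:
 *   const target = pageForDevice(AFRAME.utils.device.isMobile());
 *   document.querySelector('#enter').addEventListener('click', () => {
 *     window.location.href = target;
 *   });
 */
console.log(pageForDevice(true));  // → /master.html
console.log(pageForDevice(false)); // → /vr.html
```

Keeping the decision in a small pure function like this makes it easy to extend later, e.g. adding a headset check before falling back to the desktop VR page.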
I also began work on spatial sound effects this sprint, starting with the proxy kendo sword, which falls to the ground and bounces slightly on app launch. It now emits a sound when it collides with the floor. This is for testing and will be refined later.
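Since the sword bounces, one practical detail is keeping the impact sound from re-triggering on every physics tick. The sketch below shows a simple cooldown on the collision handler; the `collide` event and the sound wiring in the comment are assumptions based on how aframe-physics-system and A-Frame's `sound` component are typically used, not the project's actual code:

```javascript
// Hypothetical throttle: only play the collision sound if enough
// time has passed since the last play, so a bouncing body doesn't
// spam the effect on every contact.
function makeCollisionHandler(playSound, cooldownMs = 200) {
  let lastPlay = -Infinity;
  return function onCollide(nowMs) {
    if (nowMs - lastPlay >= cooldownMs) {
      lastPlay = nowMs;
      playSound();
    }
  };
}

/* In A-Frame this would hang off the sword entity, e.g.:
 *   const handler = makeCollisionHandler(
 *     () => el.components.sound.playSound());
 *   el.addEventListener('collide', () => handler(performance.now()));
 * with a positional sound attached via sound="src: #thud".
 */

// Quick check outside the browser: three collisions, 0ms, 50ms, 250ms.
let plays = 0;
const handler = makeCollisionHandler(() => { plays++; });
handler(0); handler(50); handler(250);
console.log(plays); // → 2 (the 50ms bounce is inside the cooldown)
```

The cooldown value is arbitrary here; tuning it against the physics timestep would be part of the "will be refined later" work.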
Next sprint I will continue work on the spatial sound effects, integrate the student test environment with the current VR environment, and implement collision detection on the different parts of the dummy so they can act as the responses to the sequence.