Saturday 9 May 2015

UIST 2015 Paper and video

We wrote a paper for the UIST 2015 conference; this is the video we made for the demo:





Tests before the Demo Day

Team Touch it is the name of our work :)

A few days before the Demo Day on 27 April, we tested it with two applications we developed:

1. Whack a Mole: each user is assigned a different colour (red, blue, or brown) and can only whack moles of the same colour; input from other users is ignored. The brown moles can be whacked by any user.
Moles labelled +2 and +3 require collective input from two or three users; if fewer users touch than the number on the mole, the input is ignored.




2. Shared canvas: the second application is a shared paint canvas that assigns a different colour to each user. Each individual's contributions to the shared workspace are tracked by colour. Collective contributions by two or more users result in the combination of the users' individual colours (red and blue yield purple, green and blue yield yellow).
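The rules of both demos can be sketched in Python. This is a minimal illustration, not our actual implementation: the helper names, the user-colour table, and the additive RGB mixing are assumptions (in the real applications the set of touching users comes from the frequency detector over serial).

```python
# Whack a Mole rule: coloured moles respond only to the matching user,
# brown moles to anyone, and +2/+3 moles need that many simultaneous users.
MOLE_THRESHOLDS = {"+2": 2, "+3": 3}

def mole_is_whacked(mole, touching_users):
    if mole in MOLE_THRESHOLDS:
        return len(touching_users) >= MOLE_THRESHOLDS[mole]
    if mole == "brown":
        return len(touching_users) >= 1
    return mole in touching_users  # user IDs are their assigned colours

# Shared canvas rule: combine the colours of the users touching together
# (additive RGB mixing as an approximation of the colour combination).
USER_RGB = {"red": (255, 0, 0), "blue": (0, 0, 255), "green": (0, 255, 0)}

def stroke_colour(touching_users):
    r = min(255, sum(USER_RGB[u][0] for u in touching_users))
    g = min(255, sum(USER_RGB[u][1] for u in touching_users))
    b = min(255, sum(USER_RGB[u][2] for u in touching_users))
    return (r, g, b)
```

So a +2 mole touched by a single user stays up, while a joint red-and-blue stroke on the canvas is drawn in purple.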








Monday 30 March 2015

WE MADE IT :)

We wrote a simple Python game that demonstrates the idea of identifying different frequencies — this was a big step and a huge success. The tank game performs different tasks based on the different frequencies sent from the AD9850 and the Arduino; these signals are detected by a conductive sheet and then either fed to the game or sent to the PicoScope oscilloscope for display.
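The dispatch idea in the game can be sketched as follows. The frequency bands and task names below are illustrative assumptions, not the values we actually used, and the detector readings are simulated rather than read from the conductive sheet:

```python
# Each generated frequency maps to a different in-game task.
FREQUENCY_BANDS = [           # (low Hz, high Hz, task)
    (90_000, 110_000, "move"),
    (190_000, 210_000, "rotate"),
    (290_000, 310_000, "fire"),
]

def task_for(freq_hz):
    """Return the task whose band contains the detected frequency."""
    for low, high, task in FREQUENCY_BANDS:
        if low <= freq_hz <= high:
            return task
    return None  # unknown frequency: ignore the reading

# Simulated detector readings from the conductive sheet:
readings = [100_000, 300_000, 123_456]
print([task_for(f) for f in readings])  # → ['move', 'fire', None]
```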


The video below shows this:



AD9850 with Arduino to generate frequency

The AD9850, together with the Arduino, can be used to generate a frequency that can be set to any required value, and this is what we did.

Information about the AD9850 can be found here
http://www.analog.com/media/en/technical-documentation/data-sheets/AD9850.pdf


With the help of Andy, who tested it first, and the following tutorial
http://webshed.org/wiki/AD9850_Arduino
we made three devices, each consisting of an Arduino and an AD9850, as follows:
1- Using the Arduino software, write simple code to control the AD9850 chip; the code can be found on the website above.

2- Connect the Arduino to the AD9850 with a few cables.

3- Take the output from the ZOUT pin of the AD9850 to detect the frequency; we tested this using the oscilloscope.
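Per the AD9850 datasheet, the output frequency is set by a 32-bit tuning word: f_out = word × CLKIN / 2^32, where CLKIN is 125 MHz on the common modules. A small Python sketch (helper names are our own) shows how to compute the word that the Arduino code loads into the chip:

```python
# AD9850 frequency tuning word: f_out = word * CLKIN / 2**32
CLKIN_HZ = 125_000_000  # reference clock on the usual AD9850 modules

def tuning_word(freq_hz):
    """32-bit word to load into the AD9850 for the requested frequency."""
    return round(freq_hz * (1 << 32) / CLKIN_HZ)

def actual_frequency(word):
    """Frequency the chip will actually produce for a given word."""
    return word * CLKIN_HZ / (1 << 32)

word = tuning_word(1_000_000)          # request 1 MHz
print(word, actual_frequency(word))    # word 34359738, ~0.01 Hz off
```

Because the word is an integer, the generated frequency is quantised; the resolution is CLKIN / 2^32 ≈ 0.03 Hz, far finer than we needed to keep the users' frequencies apart.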



Monday 23 March 2015

Circuits tests and display application


WorkShop: 
Date:23 March 2015
Time: 3-6pm
Location: Queens Building lab.


In this session, some of us worked on the circuits they had made to generate frequencies, together with band-pass filters to help identify the different frequencies. However, the BIG lab offered us the use of their USB oscilloscope, so we went for it.

Meanwhile, Andy and I worked on a simple Arduino serial application to display the results of the frequency detection.



When the frequency is generated, it is sent over serial as a stream of 0s (no touch) and 1s (a touch from the user); based on this input, the application can identify the users.
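The input handling can be sketched like this. It is a simplified assumption of the format: here each serial line carries one 0/1 bit per user channel (so "101" means the first and third users are touching), and the lines are simulated strings rather than reads from the Arduino:

```python
# One channel per user; the order is an assumption for this sketch.
USERS = ["red", "blue", "green"]

def touching_users(sample):
    """Map a serial line like '101' to the set of users currently touching."""
    return {user for user, bit in zip(USERS, sample.strip()) if bit == "1"}

for line in ["000", "100", "101"]:   # simulated serial lines
    print(sorted(touching_users(line)))
```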

Friday 6 March 2015

Workshop 3 - Oscilloscope session

Date: 2 March 2015
Time: 3-5 pm
Place: Queens Building lab.

Having decided to work with signal transmission through the human body, and to test the possibility of generating a signal unique to each user and transmitting it through the body, in this workshop we tried to test signal transmission through the body. For this test we used a function generator and an oscilloscope.






After a few tests, the idea looks promising: a 10 MHz signal can be transmitted through the human body. Moreover, the signal can also be transmitted through a chain of five people.

So we got more excited as we saw the possibilities. We discussed what our device is going to look like: simply a wrist device or band that the user wears and that generates a frequency. When more users wear the same device, each one generates a different frequency that can be detected by the oscilloscope, so the users can be identified based on their unique frequencies.



WorkShop 1

The first workshop for the Interactive Devices unit
Time: Monday 9 Feb 2015
Place: Queens Building lab.

During this workshop we were grouped to work together on the coursework; my group has five students, including me.

Group name: not formed yet  :/
Group members : Me :) , Sven, Paul, Andy, Ethan.


The workshop was a brainstorm to come up with ideas to work on for the coursework. Through the discussion, different ideas were suggested and developed, for example: clothing that infers (e.g. a T-shirt that shows how the user feels), a suitcase that follows you, a deadline informer, a bike sensor (to tell the distance to cars), and a watch to monitor temperature. The discussion lasted two hours, and at the end a few papers were suggested for further reading, to help choose the project we will work on.

A collection of pictures from the first workshop:











Wednesday 4 February 2015

Robot Gestures Make Difficult Tasks Easier: The Impact of Gestures on Perceived Workload and Task Performance

      Due to the importance of gestures as a non-verbal signal in human communication, they have been adopted in the field of human-computer interaction.
The research examines the strong influence of robot gestures and shows by experiment how they can improve users' task performance and perceived workload. The HRI direction-giving experiment this research addressed found that gestures positively help participants recall information; the experiment tested two hypotheses:

1. The first hypothesis is that participants' performance will be better with robot gestures on a difficult task, but not on an easy one.
2. The second is that people's perceived workload will be lower with the help of robot gestures, whether the task is easy or difficult.



In this experiment, the Nao robot was used [Figure 1].
Figure [1]: Gestures of the Nao robot


The experiment is a memory game with a number of participants (all of them students) and the Nao robot. The robot told the participants a random set of directions to navigate a building, and they were later asked to recall them. The same instructions and directions were given to all participants. The researchers used a computer connected to the robot in order to control it.
In addition, the experiment was run with different robot behaviours, for example with and without the gesture condition, and with different tasks for the participants, such as an easy task and a difficult one.
The results show that robot gestures provide significant support for the human.