Yesterday at BVN, Annisa and I ran an experiment measuring the RSSI values from an Estimote and a Hockey Puck. We compared the RSSI values as I rotated on a fixed point in 16-degree steps at 20-second intervals. Each test took around 8 minutes, and we measured the received RSSI values at 1 m, 2 m, 3 m and 4 m.
Average detection rate
We found that the Hockey Puck was detected almost twice as often as the Estimote. The RSSI values themselves, however, were quite similar between the two.
On our first look at the results, we noticed something strange about the diagrams. We later realised that the overlays of the RSSI values at the measured distances were offset because of the starting angle we measured from. To fix this, we need to break down the filtered data sheets, figure out at what point I was facing a specific direction, and derive that angle.
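The offset correction could look something like this: a minimal sketch, assuming the filtered sheets give us one RSSI sample per row with the elapsed time since the start of the test, and that we can read the starting-angle offset off the sheets once we know which direction I was facing (the 48-degree offset below is a made-up placeholder).

```python
import pandas as pd

# Hypothetical filtered data: one RSSI sample per row, with elapsed
# seconds since the start of the test.
df = pd.DataFrame({
    "elapsed_s": [0, 20, 40, 60],
    "rssi": [-62, -65, -71, -68],
})

STEP_DEG = 16     # rotation step between readings (from the protocol)
INTERVAL_S = 20   # seconds spent at each step
OFFSET_DEG = 48   # assumed starting-angle offset, to be read off the sheets

# Convert elapsed time to a facing angle, then shift by the starting
# offset so every test shares the same 0-degree reference.
df["angle_deg"] = (df["elapsed_s"] // INTERVAL_S * STEP_DEG + OFFSET_DEG) % 360
print(df["angle_deg"].tolist())
```

With the angles normalised like this, the polar pattern diagrams for all four distances should line up instead of being rotated against each other.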
Another thing we noticed is that the BLE readings may be skewed, as we do not know exactly where on the new RPi 3 the broadcast signal is coming from.
Rewrite experiment 6 and figure out where the angles overlap on the Polar Pattern diagrams.
I also need to produce the final baseStation.json file for Ben at ARUP, listing the locations of all the beacons, so we can plug it into the visualisation he has going. e.g. https://github.com/ArupAus/code2016/blob/master/helpers/movement_data_baseStations_ARUP.json
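Generating that file could be as simple as dumping a dict per beacon; the IDs, coordinates and field names below are placeholders — the real structure should be copied from Ben's existing movement_data_baseStations_ARUP.json rather than this sketch.

```python
import json

# Hypothetical beacon locations; ids, coordinates and key names are
# illustrative only and must match the format Ben's visualisation expects.
base_stations = [
    {"id": "beacon-01", "x": 0.0, "y": 0.0},
    {"id": "beacon-02", "x": 4.5, "y": 0.0},
    {"id": "beacon-03", "x": 2.25, "y": 3.8},
]

with open("baseStation.json", "w") as f:
    json.dump(base_stations, f, indent=2)
```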
We should also take a look at alternative scientific computing platforms other than Google Sheets (lel). Ben made a good suggestion of using Pandas in a Python notebook, which I will give a crack when I redo the results from experiment 6.
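As a taste of what that workflow might look like: a hedged sketch (the readings below are made up) of the kind of aggregation the sheets currently do — mean RSSI per distance and angle — done reproducibly in pandas.

```python
import pandas as pd

# Hypothetical raw readings from one test: distance in metres, facing
# angle in degrees, and the RSSI value received at that moment.
readings = pd.DataFrame({
    "distance_m": [1, 1, 1, 2, 2, 2],
    "angle_deg":  [0, 0, 16, 0, 0, 16],
    "rssi":       [-58, -60, -63, -66, -68, -70],
})

# Mean RSSI per (distance, angle) -- the same aggregation the sheets
# do, but scriptable and easy to rerun when the raw data changes.
summary = readings.groupby(["distance_m", "angle_deg"])["rssi"].mean()
print(summary)
```

A notebook doing this would also make it trivial to re-plot the polar pattern diagrams every time the angle offsets get fixed.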
It might be worthwhile to repeat experiment 6, but turning around 5 times, to see whether the pattern or 'hump' recurs at that specific spot or is consistently inconsistent; either finding would be interesting and helpful.
I need to write up more about the triangles and centroids idea and experiment with it, along with some possible problems and solutions that might come out of it. For example, the corner issue I talked about in my last post.
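The core of the idea, as I understand it so far, is simple enough to sketch: take the triangle of candidate points suggested by the beacon range estimates and use its centroid as the position estimate. The points below are invented for illustration, not real beacon data.

```python
def centroid(points):
    """Centroid (coordinate-wise average) of a list of (x, y) points."""
    xs = [p[0] for p in points]
    ys = [p[1] for p in points]
    return (sum(xs) / len(xs), sum(ys) / len(ys))

# Hypothetical triangle of candidate points from three beacon ranges.
triangle = [(0.0, 0.0), (3.0, 0.0), (0.0, 3.0)]
print(centroid(triangle))  # → (1.0, 1.0)
```

The corner issue would show up here as a degenerate or very skinny triangle, where the centroid gets dragged toward the wall; that is the kind of case the write-up needs to work through.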
I need to write up about the beacon detection options which I have been mulling over with Ben for a while.
Read this article: https://www.autodeskresearch.com/publications/personas and have a go at the Visual Motion thingy that Oasis demonstrated for us yesterday at BVN. I want to see how it takes a series of positions and translates them into an animation; however, this is very experimental and I am not sure exactly what we will get or what we are looking for.