Yeahuay Wu

I am a Math/CS major at Temple University in Philly. I'm working on Dr. Ngu's team with Brock Yarbrough and Andrew Polican. The rise of IoT promises a lot of exciting things for the future, and I am stoked to be working with and testing the various platforms for IoT development that have emerged over the past few years.

My journey to CS has been odd: I was an Economics major in the business school, a philosophy major, and a geology major before falling in love with Mathematics and Computer Science. I would probably have settled for just being a poet if not for the great examples that CCRU, SFPC, and even Ada Lovelace set in approaching the poetical sciences. This is for the best, since the sciences have given me a rigorous understanding of the world and an approach to philosophy. I appreciate and crave the balance that everything I have been exposed to has provided me with.

June 1

Dr. Ngu showed us some papers on different platforms for IoT management, and we were told to read through them and figure out which was most suitable for the project we are working on. Our objective is to create an application that will detect the physical/mental state of its user by collecting their health data over a period of time. The application seems to serve as a demonstration of the chosen framework rather than being the end goal itself. Our task will be split into three parts: data collection, data analysis, and data visualization. We were also asked to assess which wearable would be best for the project.

June 2

The day mostly consisted of reading research papers on the different platforms available for IoT. The one I like the most so far is Swarmlet because of its malleability. Dr. Ngu told us to look into a research paper on convolutional neural networks, as we will be using machine learning techniques as a component of our application.

June 3

A day full of presentations on the various areas of research the REU advisors were involved in. I learned a lot of new vocabulary that helped me out when reading research papers later. Met with Dr. Ngu today about the progress we've made researching the various platforms. We told her that we wanted to use Swarmlet. She suggested we compile a list of concrete reasons for choosing Swarmlet over GoogleFit or GSN, since we would have to defend our decision during a presentation. She also told us to look into particle filtering as a possible ML algorithm we could use. After the meeting, my group divided tasks using Trello (plug!). The whole REU IoT group went to eat BBQ that night as a little welcome party.

June 4

Spent today mostly setting up tools: getting the Ptolemy II system design software running on my machine and getting its MATLAB interface to work. I tried to run through a couple of the example systems available on the Ptolemy website. The examples involving the machine learning library wouldn't run, but a couple of linear regression examples did. That's fine, though: since I got MATLAB working, I don't really need the built-in libraries. The biggest task on our plate is to figure out the whole Swarmlet framework. The ideas presented in the Swarmlet paper were very high level, so the finer details still have to be figured out. We hope to be able to write an accessor (a software abstraction of a device such as a sensor) by Tuesday (June 7).

Week 7

Well, I did a great job updating this journal as I went along :(. Over the past six weeks, Andrew, Brock, and I decided to use Swarmlet to create a fall detection IoT application with the MS Band. Andrew's task was to create accessors for the application (the Swarmlet route), while Brock's task was to create a native app by adapting the BAC detection app written by Mario, a previous REU student. My job for the past six weeks was to train a fall detection model. My initial tasks were grunt work like data collection (I was the only participant in my own study for a while, meaning I had to fall onto a mattress many times to get an initial training set) and package installation. Getting R packages installed on my Linux box was a pain since all of the dependencies had to be installed manually. I decided to use an SVM for my model since there was already existing literature on its use and since we had limited time. Previous researchers have used SVMs to a high degree of accuracy, although they were using more intrusive trunk-mounted sensors.
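
For the curious, here is a minimal sketch of what the training setup looked like in R using the e1071 package, which provides svm(). The file name and feature columns are stand-ins; the real feature set came out of the selection process described below.

```r
# Minimal sketch of the training setup (hypothetical file and columns).
library(e1071)  # provides svm() and predict() for SVM models

# falls.csv stands in for the per-window feature file; "label" is
# either "fall" or "adl" (activity of daily living)
data <- read.csv("falls.csv")
data$label <- as.factor(data$label)

# Train an SVM classifier on the accelerometer-derived features
model <- svm(label ~ ., data = data, kernel = "radial")

# Classify one window of features (the extra label column is ignored)
predict(model, newdata = data[1, ])
```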

Choosing features involved a lot of paper reading, dissecting, and principal component analysis. At first I narrowed it down to six features, but soon found that the velocity-based features were causing my model to overfit. They would do great in 10-fold cross-validation (95% accuracy), but when run against a testing set, the results were awful (14%).
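
To illustrate the gap (a sketch only; the file names and the train/test split are made up), e1071's svm() can run k-fold cross-validation internally via its cross argument, which makes it easy to compare the rosy cross-validation number against a held-out test set:

```r
library(e1071)

train <- read.csv("train_windows.csv")   # hypothetical split
test  <- read.csv("test_windows.csv")
train$label <- as.factor(train$label)
test$label  <- as.factor(test$label)

# 10-fold cross-validation on the training data
model <- svm(label ~ ., data = train, cross = 10)
model$tot.accuracy        # looked great (~95%) with the velocity features

# The honest number: accuracy on data the model never saw
pred <- predict(model, newdata = test)
mean(pred == test$label)  # collapsed (~14%) until those features were dropped
```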

My main worry was false positives, but it turns out my model can differentiate between falling and simply waving your hand quickly. We actually ended up worrying more about false negatives, since they are more deadly: a missed fall means no one gets alerted. The model was unable to detect 1 fall out of 9 in the testing set, which is a good "start".
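
In confusion-matrix terms (continuing the hypothetical objects from the sketch above, with "adl" as the non-fall class), the false negative rate is just the fraction of actual falls the model misses:

```r
# Rows: predicted class; columns: actual class
conf <- table(predicted = pred, actual = test$label)

# Missed falls divided by all actual falls (e.g. 1/9 above)
fn_rate <- conf["adl", "fall"] / sum(conf[, "fall"])
```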

Week 8

This was integration week, when the three of us were supposed to get all the pieces of the project integrated. In the end, due to programming difficulties and lack of time, we were only able to integrate my model with Brock's native Android app. Andrew ran into issues with the lack of available information about the J2V8 engine, which, combined with the lack of time, held him back. He was able to stream data from the band using the accessor he wrote; it was streaming concurrently that he had an issue with.

It would have been a dream come true to get this working with the Swarmlet technology, but we just ran into a lot of walls along the way. Now we have to focus on creating a great fall detection demo for poster day, as well as documentation so that this model can be used in high school computer science programs.

Week 9

This was the last week. The good news is that Andrew was able to get my model working on his J2V8 host. Although it doesn't adhere 100% to the accessor specifications, it's acceptable. I'm stoked! The bad news is that my model did not perform well in real life. Brock hooked his modified app up to the MS Band and loaded the model onto it. It turns out that it will detect rapid hand waving as falling. It will also detect running as falling (although we changed the implementation to mitigate this; more on that below). This was our fear in the first place. I've tried to train the model against these cases by adding false falls to the training set, but that only seems to confuse the classification. This means I will have to go back and adjust the features I used, maybe make the sliding window longer, or even include velocity features again.
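
For context, the sliding window works roughly like this (an R sketch with made-up window and step sizes): each window of raw accelerometer samples is collapsed into summary features, so lengthening the window is one way to smooth over quick gestures like hand waving.

```r
# accel is assumed to be a data frame of raw x/y/z accelerometer samples.
# width/step are in samples and are the knobs I would revisit.
window_features <- function(accel, width = 50, step = 25) {
  mag <- sqrt(accel$x^2 + accel$y^2 + accel$z^2)  # acceleration magnitude
  starts <- seq(1, length(mag) - width + 1, by = step)
  # One row of summary statistics per window
  t(sapply(starts, function(s) {
    w <- mag[s:(s + width - 1)]
    c(mean = mean(w), sd = sd(w), max = max(w),
      min = min(w), range = max(w) - min(w))
  }))
}
```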

Brock used his remaining time to change the implementation of the fall detection app: basically, he set it so that if the model detects 2-5 "falls" in a row across consecutive 250ms slices of time, the program classifies the whole sequence as a fall. Any fewer and it was probably a sudden gesture; any more and the action is probably a longer-term movement like running. This is a good temporary fix.
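
Roughly, the rule looks like this (a sketch of the logic only; Brock's actual implementation lives in the Android app):

```r
# slices: one classifier output per 250ms slice, TRUE meaning "fall"
is_real_fall <- function(slices) {
  r <- rle(slices)              # run-length encode consecutive outputs
  runs <- r$lengths[r$values]   # lengths of consecutive "fall" runs
  any(runs >= 2 & runs <= 5)    # 1 slice = gesture, >5 = running, etc.
}

is_real_fall(c(FALSE, TRUE, TRUE, TRUE, FALSE))  # TRUE: looks like a fall
is_real_fall(c(FALSE, TRUE, FALSE))              # FALSE: one-off spike
is_real_fall(rep(TRUE, 10))                      # FALSE: sustained motion
```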

However, we have barely any time left, so I guess we'll just have to settle for a slightly working demo on poster day. The shining point of this whole week was when we were able to load my model onto Andrew's host.

Now comes the poster making and paper writing.