Day 9: Research groups, human-car interaction demo, and Invoke string quartet!

The second week of SAILORS commenced with a highly anticipated meeting in our research groups. Our research projects are guided by students and assistant professors from the Stanford Artificial Intelligence Laboratory.

Here’s a sneak peek at the two research project meetings, NLP and computer vision, that we, Sarah and Kyra, attended.

To kick off the natural language processing (NLP) meeting, the group was introduced to the nest of NLP research: the linguistics building. Enjoying the new, spacious environment, we settled into turning last week’s Bayes’ Rule into this week’s Naive Bayes model. Using the example of a game show with red and green balls in two boxes, the group set off with Python refreshers and exercises to build their very own Naive Bayes algorithms. After sufficient time on Jupyter Notebook, the group moved on to discuss the many uses of Naive Bayes and chatbots, old and new.
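
For readers curious what that box-and-ball warm-up looks like in code, here is a minimal Python sketch of the Bayes’ Rule calculation; the specific ball counts are made up for illustration rather than taken from the actual exercise.

```python
# A minimal sketch of the Bayes' Rule warm-up (the ball counts here are made up
# for illustration, not the actual numbers from the exercise).
#
# Setup: a game show host picks one of two boxes at random, then draws a ball.
# Box 1 holds 3 red and 1 green ball; Box 2 holds 1 red and 3 green balls.
# If we see a red ball, which box did it probably come from?

prior = {"box1": 0.5, "box2": 0.5}               # P(box): each box equally likely
likelihood_red = {"box1": 3 / 4, "box2": 1 / 4}  # P(red | box)

# Bayes' Rule: P(box | red) = P(red | box) * P(box) / P(red)
evidence = sum(likelihood_red[b] * prior[b] for b in prior)            # P(red)
posterior = {b: likelihood_red[b] * prior[b] / evidence for b in prior}

print(posterior)  # {'box1': 0.75, 'box2': 0.25} -> the red ball likely came from box 1
```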

Soon, the NLP group will apply their new knowledge to their main project and use Naive Bayes to classify text tweeted during Hurricane Sandy.
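
As a rough preview of what that classifier could look like, here is a sketch using scikit-learn’s multinomial Naive Bayes; the tiny example tweets and labels below are invented placeholders, not the real Hurricane Sandy data or the group’s eventual code.

```python
# A rough sketch of Naive Bayes text classification, in the spirit of the tweet
# project. The tiny "tweets" and labels below are invented placeholders.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB

tweets = [
    "power is out and the streets are flooded",
    "stay safe everyone, huge winds tonight",
    "great pizza with friends downtown",
    "watching a movie at home",
]
labels = ["storm-related", "storm-related", "not-related", "not-related"]

# Turn each tweet into word counts, then fit a multinomial Naive Bayes model.
vectorizer = CountVectorizer()
X = vectorizer.fit_transform(tweets)
model = MultinomialNB().fit(X, labels)

new_tweet = ["no power on our street tonight"]
print(model.predict(vectorizer.transform(new_tweet)))  # likely 'storm-related'
```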

In the computer vision research group, led by CS master’s students Shane and Alex, we began coding for our research project, a poverty map of Uganda. Using Python on Jupyter Notebook, we wrote classes for machine learning and poverty identification. We debated which features would be most crucial for identifying poverty in Uganda from the Ermon Lab’s satellite images, such as roads, lights, cars, and property value.
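
To give a flavor of that feature debate, here is a hypothetical sketch of how hand-picked features such as road density or nighttime lights could feed a simple classifier; the feature names, numbers, and choice of logistic regression are illustrative assumptions, not the group’s actual project code.

```python
# A hypothetical sketch of how hand-picked satellite features (roads, lights,
# cars) could feed a simple classifier. The numbers, feature names, and the use
# of logistic regression are illustrative assumptions.
import numpy as np
from sklearn.linear_model import LogisticRegression

# Each row is one image region: [road_density, night_light_intensity, car_count]
features = np.array([
    [0.8, 0.9, 12],
    [0.7, 0.8, 9],
    [0.1, 0.05, 0],
    [0.2, 0.1, 1],
])
is_poor = np.array([0, 0, 1, 1])  # 1 = region labeled as poor, 0 = not poor

model = LogisticRegression().fit(features, is_poor)
print(model.predict([[0.15, 0.1, 1]]))  # expect [1]: sparse roads/lights suggest poverty
```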

Photo by Sarah Chun.

Above is a photo from the robotics research project. Photo by Lauren Yang.

After our morning research projects, we had the opportunity to listen to a lecture, ‘Decoding the Human Genome to Decipher the Genomic Basis of Disease,’ by Anshul Kundaje, Associate Professor of Genetics and Computer Science at Stanford University. Prof. Kundaje spoke about using artificial intelligence to decode and identify genome functions within human DNA. Using deep convolutional neural networks (CNNs), researchers can learn hundreds of novel patterns in cell-specific control elements in our genes. We also learned about the future of genome sequencing, which could identify possible disease-associated variants within our DNA and give the public access to their personal genome sequence and diagnosis.
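
For a rough sense of what “a CNN reading DNA” means, here is a toy sketch of a 1D convolutional network scanning one-hot-encoded sequence for short motifs; the architecture, sizes, and task are illustrative assumptions and not Prof. Kundaje’s actual models.

```python
# A toy sketch of the general idea behind CNNs on DNA: a 1D convolution scans
# one-hot-encoded sequence for short motifs. Architecture and sizes are
# illustrative assumptions only.
import torch
import torch.nn as nn

BASES = "ACGT"

def one_hot(seq):
    # Encode a DNA string as a (4, length) tensor: one channel per base.
    x = torch.zeros(4, len(seq))
    for i, base in enumerate(seq):
        x[BASES.index(base), i] = 1.0
    return x

model = nn.Sequential(
    nn.Conv1d(in_channels=4, out_channels=16, kernel_size=8),  # motif scanners
    nn.ReLU(),
    nn.AdaptiveMaxPool1d(1),   # keep the strongest motif match anywhere in the sequence
    nn.Flatten(),
    nn.Linear(16, 1),
    nn.Sigmoid(),              # e.g. "does this look like a regulatory element?"
)

seq = "ACGTACGTTTGACGTCAAACGTACGT"
score = model(one_hot(seq).unsqueeze(0))  # add a batch dimension
print(score.item())  # untrained, so this is just a random score between 0 and 1
```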

Photo by Hank Tian.

Before dinner, we had a demo called ‘Human-Car Interactions,’ where we learned about human behavior alongside artificially intelligent cars.

Photos by Lauren Yang.

Above are the bloggers of this post. Left: Sarah. Right: Kyra. Photo by Lauren Yang.

Upon arrival, the SAILORS were greeted by the contemporary string quartet Invoke. With strings and banjos, a cappella vocals, and special sound effects, the musicians delivered a mystical, mesmerizing, and exhilarating performance. We were quick to follow up with questions and contact requests, as always.

Photo by Lauren Yang.

Blog post by Kyra Mo and Sarah Chun. Photos by Lauren Yang, Kyra Mo, Sarah Chun, and Hank Tian. 
