Posts

Showing posts from 2017

BigData White Papers

I don't know about you, but I always like to read the white papers that originated OpenSource projects (when available, of course :) ). I have been working with BigData quite a lot lately, and this area is mostly dominated by Apache OpenSource projects. So, naturally (given the nerd that I am), I tried to investigate their history. I created a list of the articles and companies that originated most of the BigData Apache projects. Here it is! Hope you guys find it interesting too. :)

Apache Hadoop
Based on: Google MapReduce and GFS
Papers:
https://static.googleusercontent.com/media/research.google.com/en//archive/mapreduce-osdi04.pdf
https://static.googleusercontent.com/media/research.google.com/en//archive/gfs-sosp2003.pdf

Apache Spark
Created by: University of California, Berkeley
Papers:
http://people.csail.mit.edu/matei/papers/2012/nsdi_spark.pdf
http://people.csail.mit.edu/matei/papers/2010/hotcloud_spark.pdf
http://peo...

Deep Learning, TensorFlow and Tensor Core

I was lucky enough to get a ticket to Google I/O 2017 through a Google Code Jam for Women (for those who don't know, Google runs a programming contest for women, and the top finishers win tickets to the conference). One of the main topics of the conference was, for sure, its new Deep Learning library TensorFlow. TensorFlow is Google's OpenSource Machine Learning library and runs on both CPU and GPU. Two very cool things were presented at Google I/O:

TPU (Tensor Processing Unit) - a processor designed specifically for TensorFlow workloads that can be used on the Google Cloud
TensorFlow Lite - a lightweight version of TensorFlow that runs on Android and makes developers' lives easier

Last week, at a BigData meetup in Chicago, I learned that Nvidia has also created dedicated hardware for Deep Learning processing, the Tensor Core. With all this infrastructure and these APIs becoming available, Deep Learning is getting considerably easier and faster. At Go...
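As a quick illustration of what "runs on both CPU and GPU" means in practice, here is a minimal sketch (not from the original post) using the tensorflow R package from RStudio; it assumes a TensorFlow 1.x runtime has already been installed with install_tensorflow():

  # Assumption: the 'tensorflow' R package and a TF 1.x runtime are installed.
  library(tensorflow)

  # Build a tiny graph: multiply two constant matrices.
  a <- tf$constant(matrix(c(1, 2, 3, 4), nrow = 2), dtype = tf$float32)
  b <- tf$constant(matrix(c(5, 6, 7, 8), nrow = 2), dtype = tf$float32)
  product <- tf$matmul(a, b)

  # Run the graph; TensorFlow places the ops on a GPU automatically when one
  # is available, otherwise it falls back to the CPU.
  sess <- tf$Session()
  print(sess$run(product))
  sess$close()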

Errors when using the neuralnet package in R

OK, so you have read a bunch of material on how to build Neural Networks, how many layers or nodes you should add, etc. But when you start implementing the actual Neural Network, you hit a ton of dummy errors that stop your beautiful, inspired programming. This post talks about some errors you might face when using the neuralnet package in R. First, remember that to use the package you should install it:

install.packages("neuralnet")

Then load it with library("neuralnet").

Error 1

One error that might happen while training your neural network is this:

nn <- neuralnet(formula1, data = new_data, hidden = c(5, 3))
Error in terms.formula(formula) : invalid model formula in ExtractVars

This happens when the names of the variables in the formula "formula1" are in a format the formula parser does not accept. For example, if you named your columns (or variables) as numbers, you would get this error. So change your column names and re-run the model! Example: label ~ 1 ...
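To make the fix concrete, here is a minimal sketch of the rename-then-retrain step described above (not from the original post; the column names and toy data are made up for illustration):

  library(neuralnet)

  # Toy data with a learnable target; every column name is a valid R name
  # (e.g. "x1" instead of "1"), which avoids the ExtractVars error above.
  set.seed(42)
  new_data <- data.frame(x1 = runif(200), x2 = runif(200), x3 = runif(200))
  new_data$label <- new_data$x1 + new_data$x2

  # With valid column names, the formula parses and the network trains.
  formula1 <- label ~ x1 + x2 + x3
  nn <- neuralnet(formula1, data = new_data, hidden = c(5, 3))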