For a more technical explanation on neural networks, refer to one of my older posts.
With the explosion of AI across all industries, Deep Learning has been responsible for many of the major breakthroughs. But for people outside of tech, what exactly is Deep Learning?
Deep Learning is a family of machine learning models based on artificial neural networks.
Okay, we just described what Deep Learning is. However, that definition doesn't explain what exactly an artificial neural network (ANN) is.
by Joseph Woolf
Earlier this week, I attended the O'Reilly AI Conference up in San Jose, CA. Wednesday and Thursday started off with keynotes showcasing what companies were currently researching in the field of AI. While I'm no expert in the field, I found four key takeaways from the keynotes.
With machine learning and AI as popular and hyped as they are, it's not surprising that cloud providers such as Azure, Google Cloud, and AWS offer services for doing machine learning. These services often don't require the user to delve into mathematically complex topics such as convolutional neural networks and backpropagation.
Currently, I'm doing training on AWS for the Associate Developer Certification as part of company training. While the likelihood of machine learning services appearing on the exam is low, I think it's a good idea to cover an overview of them anyway. I won't go over every service, but hopefully you'll be able to distinguish the major ones.
With the rise of Siri, Google Home, Alexa, and Cortana, it's obvious that there's a demand for chatbots. In the past, chatbots were more of a niche technology due to limited functionality. With recent advancements in computer technology, chatbots have now become practical for everyday use.
Most models in machine learning require working with numbers. After all, many of the machine learning algorithms we've seen are derived from statistics (Linear Regression, Logistic Regression, Naive Bayes, etc.). Additionally, machines can understand and work with numbers far more easily than we humans can.
However, machines just process the numbers and execute algorithms. They don't interpret the numbers returned. They don't understand the context of the data. They especially don't understand human intricacies and can easily be taken advantage of by rogue actors.
So then, is it actually possible for computers to understand humans? Can we ever have conversations with computers? In a sense, we already can! This is thanks to a branch of AI called Natural Language Processing.
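To make the idea concrete, here is a minimal, illustrative sketch (not from any of the posts themselves) of one of the simplest ways NLP turns text into the numbers machines need: a bag-of-words count vector. The sentences and the function name are invented for this example:

```python
from collections import Counter

def bag_of_words(docs):
    """Turn raw sentences into count vectors over a shared vocabulary."""
    # Build the vocabulary from every word seen across all documents
    vocab = sorted({word for doc in docs for word in doc.lower().split()})
    vectors = []
    for doc in docs:
        counts = Counter(doc.lower().split())
        # Each document becomes one count per vocabulary word
        vectors.append([counts.get(word, 0) for word in vocab])
    return vocab, vectors

docs = ["the cat sat", "the cat and the dog"]
vocab, vectors = bag_of_words(docs)
# vocab:   ['and', 'cat', 'dog', 'sat', 'the']
# vectors: [[0, 1, 0, 1, 1], [1, 1, 1, 0, 2]]
```

Once text is in this numeric form, the statistical models mentioned above (Naive Bayes in particular) can be trained on it directly.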
Let's imagine you're doing research on an ideal rental property. You gather your data, open up your favorite programming environment, and get to work performing Exploratory Data Analysis (EDA). During your EDA, you find some dirty data and clean it up for training. You decide on a model, separate the data into training, validation, and testing sets, and train your model on the cleaned data. Upon evaluating your model on the validation and test data, you notice that both your validation error and your test error are very high.
Now suppose you pick a different model or add additional features. Your validation error is much lower. Great! However, upon using your testing data, you notice that the error is still high. What just happened?
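One safeguard against exactly this trap is holding the test set back until the very end, touching it only once. As a rough illustration (not the post's actual workflow; the 60/20/20 ratios and the function name are assumptions made for this example), a three-way split might look like:

```python
import numpy as np

def train_val_test_split(X, y, val_frac=0.2, test_frac=0.2, seed=0):
    """Shuffle the data once, then carve out validation and test sets."""
    rng = np.random.default_rng(seed)
    idx = rng.permutation(len(X))
    n_test = int(len(X) * test_frac)
    n_val = int(len(X) * val_frac)
    # Test data is set aside first and never used for model selection
    test_idx = idx[:n_test]
    val_idx = idx[n_test:n_test + n_val]
    train_idx = idx[n_test + n_val:]
    return (X[train_idx], y[train_idx],
            X[val_idx], y[val_idx],
            X[test_idx], y[test_idx])

# Toy data: 100 samples, 3 features each
X = np.arange(300, dtype=float).reshape(100, 3)
y = np.arange(100)
X_tr, y_tr, X_va, y_va, X_te, y_te = train_val_test_split(X, y)
print(len(X_tr), len(X_va), len(X_te))  # 60 20 20
```

The point of the separation: every time you tweak the model based on validation error, information about the validation set leaks into your choices, so only the untouched test set gives an honest estimate of real-world error.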
I admit, I'm late to the whole Neural Network party. With all of the major news covering AI systems that use neural networks as part of their implementation, you'd have to be living under a rock not to know about them. While it's true that they can provide more flexible models than other machine learning algorithms, they can be challenging to work with.
For those wanting to work with Big Data, it isn't enough to simply know a programming language and a small-scale library. Once your data reaches many gigabytes, if not terabytes, in size, working with it becomes cumbersome. Your computer can only run so fast and store so much. At this point, you would look into what kind of tooling is used for massive amounts of data. One of the tools you would consider is Apache Spark. In this post, we'll look at what Spark is, what we can do with it, and why you'd use it.
One of the hottest disciplines in the tech industry in 2017 was Deep Learning. Thanks to Deep Learning, many startups placed an emphasis on AI, and many frameworks have been developed to make implementing these algorithms easier. Google's DeepMind was even able to create AlphaGo Zero, which mastered the game of Go without relying on human gameplay data. However, the analysis here is much more basic than anything developed recently. In fact, the dataset is the popular MNIST database: a collection of handwritten digits used to test computer vision systems.
In my previous post, I mentioned that there are many different algorithms that can be used to cluster a dataset. One of the most popular is the k-means clustering algorithm.
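As a preview of the idea, here is a minimal NumPy sketch of k-means (an illustrative toy, not a production implementation; the farthest-point initialization is one common heuristic I've chosen for the example, not necessarily what the post uses):

```python
import numpy as np

def kmeans(X, k, n_iters=100, seed=0):
    """Minimal k-means: alternate assignment and centroid-update steps."""
    rng = np.random.default_rng(seed)
    # Farthest-point initialization: start from a random data point,
    # then repeatedly pick the point farthest from the centroids so far
    centroids = [X[rng.integers(len(X))]]
    for _ in range(k - 1):
        d = np.min([np.linalg.norm(X - c, axis=1) for c in centroids], axis=0)
        centroids.append(X[d.argmax()])
    centroids = np.array(centroids)
    for _ in range(n_iters):
        # Assignment step: each point joins its nearest centroid
        dists = np.linalg.norm(X[:, None] - centroids[None, :], axis=2)
        labels = dists.argmin(axis=1)
        # Update step: each centroid moves to the mean of its points
        new_centroids = np.array([X[labels == c].mean(axis=0) for c in range(k)])
        if np.allclose(new_centroids, centroids):
            break  # assignments are stable; we've converged
        centroids = new_centroids
    return labels, centroids

# Two well-separated synthetic blobs around (0, 0) and (10, 10)
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0, 0.5, (50, 2)), rng.normal(10, 0.5, (50, 2))])
labels, centroids = kmeans(X, k=2)
```

On data this cleanly separated, the two recovered centroids land at the blob centers; real datasets are rarely this forgiving, which is where the choice of k and initialization starts to matter.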