In this blog we will discuss a fundamentals course on SQL, which is useful for any professional, be it a data analyst, business analyst, or data scientist, who needs to extract, manipulate, or draw insights from data stored in SQL databases. We will look at the MySQL database, starting with an introduction and installation and moving on to practical sessions.
1. Why do we need databases?
The rapid explosion of data creates the need for a place to store and retrieve data efficiently. We are truly living in the age of data.
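To make this concrete, here is a minimal sketch of storing and retrieving data with SQL. It uses Python's built-in sqlite3 module rather than MySQL (the SQL itself is very similar), and the table and names are illustrative, not from any real dataset:

```python
# Minimal sketch of storing and retrieving data with SQL, using Python's
# built-in sqlite3 module. Table and names are illustrative only.
import sqlite3

conn = sqlite3.connect(":memory:")  # throwaway in-memory database
cur = conn.cursor()
cur.execute("CREATE TABLE employees (name TEXT, role TEXT)")
cur.executemany("INSERT INTO employees VALUES (?, ?)",
                [("Asha", "data analyst"), ("Ravi", "data scientist")])
conn.commit()

# Retrieve only the analysts
cur.execute("SELECT name FROM employees WHERE role = ?", ("data analyst",))
print(cur.fetchall())  # [('Asha',)]
```

The same `CREATE TABLE` / `INSERT` / `SELECT` statements carry over to MySQL almost unchanged, which is why learning the SQL fundamentals first pays off regardless of the database engine.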
In this blog we will focus on what convolutional neural networks are and how they work.
The Convolutional Neural Network, or CNN as it is popularly known, is one of the most commonly used deep learning algorithms. Before we get into how a CNN works, let us first understand the problems faced by a traditional MLP and why we need CNNs in the first place.
ISSUES WITH A TRADITIONAL MLP & WHY WE NEED CNNs
Let us start with a simple example,
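As a preview of where this is headed, the convolution operation at the heart of a CNN can be sketched in plain Python. The same small kernel is reused at every position of the image, which is exactly what lets a CNN avoid the parameter explosion of a fully connected MLP (the 4×4 image and 3×3 kernel values below are illustrative):

```python
# Minimal 2D convolution sketch: one 3x3 kernel slides over a grayscale
# image, reusing the same 9 weights at every position (parameter sharing).
def conv2d(image, kernel):
    kh, kw = len(kernel), len(kernel[0])
    out_h = len(image) - kh + 1
    out_w = len(image[0]) - kw + 1
    output = []
    for i in range(out_h):
        row = []
        for j in range(out_w):
            # Weighted sum of the patch currently under the kernel
            s = sum(image[i + di][j + dj] * kernel[di][dj]
                    for di in range(kh) for dj in range(kw))
            row.append(s)
        output.append(row)
    return output

# A tiny 4x4 "image" with a vertical edge, and an edge-detecting kernel
img = [[0, 0, 1, 1],
       [0, 0, 1, 1],
       [0, 0, 1, 1],
       [0, 0, 1, 1]]
kernel = [[1, 0, -1],
          [1, 0, -1],
          [1, 0, -1]]
feature_map = conv2d(img, kernel)
print(feature_map)  # [[-3, -3], [-3, -3]]
```

An MLP would need a separate weight for every input pixel, whereas here the 9 kernel weights serve the whole image, however large it is.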
In this project we will solve a loan prediction problem where we have to predict, based on customer details, whether a loan should be approved or not. Here is a glance at the data.
TensorFlow is a software library, or framework, designed by the Google team to implement machine learning and deep learning concepts in an easy manner. It combines the computational algebra of optimization techniques to make the calculation of many mathematical expressions easy.
This is the official website of TensorFlow: www.tensorflow.org
Let us now consider the following important features of TensorFlow:
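Before the feature list, it helps to see TensorFlow's core idea: computations are expressed as a dataflow graph that is built first and evaluated later. The toy stdlib sketch below illustrates that idea only; it is a hypothetical illustration, not TensorFlow's actual API:

```python
# Toy illustration of TensorFlow's build-then-evaluate idea: nodes form a
# computation graph, and values flow through it on eval(). Hypothetical
# sketch for intuition only, not TF's real API.
class Node:
    def __init__(self, op, *inputs, value=None):
        self.op, self.inputs, self.value = op, inputs, value

    def eval(self):
        if self.op == "const":
            return self.value
        vals = [n.eval() for n in self.inputs]
        if self.op == "add":
            return vals[0] + vals[1]
        if self.op == "mul":
            return vals[0] * vals[1]

a = Node("const", value=3.0)
b = Node("const", value=4.0)
c = Node("mul", a, b)                          # c = a * b
d = Node("add", c, Node("const", value=1.0))   # d = a*b + 1
print(d.eval())  # 13.0
```

Representing the computation as a graph is what lets a framework like TensorFlow differentiate it automatically and run it on CPUs, GPUs, or TPUs.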
In this blog we will do a project based on image classification. Our problem statement asks us to classify images into two categories, Emergency and Non-Emergency vehicles, which makes it a binary classification problem, and we will solve it using a neural network.
Before diving deep into this project I would recommend that you first go through the basics of working with images in my deep learning blog. I am sure you will like it, and you can then proceed to the exciting theory and practicals coming ahead.
Coming to emergency vehicles, there can be various types, like police cars…
In this blog we will study how Deep Learning techniques can be applied to unstructured data like images or text. Our main agenda in this blog will be to understand what images are and how we can work with image data. So, let's start our journey.
Ever imagined how images are stored in a dataset? If not, let us see how images are stored in a computer.
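Here is the idea in miniature: a grayscale image is simply a grid of pixel intensities, typically from 0 (black) to 255 (white). The tiny 3×3 "image" below is illustrative:

```python
# A grayscale image is just a grid of pixel intensities (0 = black,
# 255 = white). An illustrative 3x3 "image" as a list of lists:
image = [[  0, 128, 255],
         [ 64, 128, 192],
         [255, 255,   0]]

height, width = len(image), len(image[0])
print(height, width)  # 3 3

# Simple neural networks usually consume a flat vector of these pixels:
flat = [pixel for row in image for pixel in row]
print(len(flat))  # 9
```

A colour (RGB) image is stored the same way, except with three such grids, one per colour channel.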
In this blog we will understand the optimizers and loss functions that are most commonly used while training neural networks. The prerequisite is a basic understanding of how the gradient descent algorithm works, so I would highly recommend referring to my previous blog, Introduction To Neural Network, for a better understanding of Gradient Descent. This algorithm is used during backward propagation to update the weights and minimize the cost function, where our aim is to reach the global minimum.
After having gone through the basics of the gradient descent algorithm, we will look…
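To ground that recap, here is a minimal pure-Python sketch of gradient descent minimizing the cost f(w) = (w - 3)^2, whose global minimum sits at w = 3. The cost function and learning rate are illustrative, not taken from the blog:

```python
# Minimal gradient descent sketch: minimize f(w) = (w - 3)^2, whose
# global minimum is at w = 3. Function and learning rate are illustrative.
def grad(w):
    return 2 * (w - 3)    # derivative of (w - 3)^2

w = 0.0                   # initial weight
lr = 0.1                  # learning rate
for _ in range(100):
    w = w - lr * grad(w)  # update rule: w := w - lr * dL/dw

print(round(w, 4))  # 3.0 (converged to the global minimum)
```

The optimizers covered later (momentum, RMSprop, Adam, and so on) are refinements of exactly this update rule.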
In this blog we will learn about the activation functions that are most widely used in Deep Learning. Before jumping to the point, let's briefly recap the basic architecture of a neural network and how it works.
For simplicity, consider the multilayer perceptron.
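As a taste of what follows, a few widely used activation functions can be written with the standard library alone (an illustrative sketch):

```python
# Common activation functions, written with the stdlib only (illustrative).
import math

def sigmoid(z):
    return 1 / (1 + math.exp(-z))  # squashes any input into (0, 1)

def relu(z):
    return max(0.0, z)             # zero for negatives, identity otherwise

def tanh(z):
    return math.tanh(z)            # squashes any input into (-1, 1)

print(round(sigmoid(0), 2))   # 0.5
print(relu(-2.0), relu(3.0))  # 0.0 3.0
```

Each hidden unit in the multilayer perceptron applies one of these to its weighted sum, which is what gives the network its non-linearity.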
In this article we will be learning about the following things.
So what are we waiting for? Let's dive in and understand the concepts behind all of the above terms used in a Neural Network.
“A neuron with a step function as the activation function is a perceptron.” Let us understand this with the following example.
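The quoted definition can be sketched directly: a weighted sum of inputs passed through a step function. The weights and bias below are illustrative, chosen so the perceptron realizes a logical AND:

```python
# A perceptron: weighted sum of inputs passed through a step function.
# The weights and bias are illustrative, chosen to realize logical AND.
def step(z):
    return 1 if z >= 0 else 0

def perceptron(inputs, weights, bias):
    z = sum(x * w for x, w in zip(inputs, weights)) + bias
    return step(z)

# AND gate: fires only when both inputs are 1
weights, bias = [1.0, 1.0], -1.5
for x in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    print(x, perceptron(x, weights, bias))  # only (1, 1) -> 1
```

Replacing the step with a smooth activation such as a sigmoid turns this perceptron into the kind of neuron used in modern networks.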
INTRODUCTION: WHAT IS A NEURON?
Let us start with a simple example of a classification problem, Loan Prediction. Our aim is to decide whether to approve a loan or not based on…
Your ability to explain this in a non-technical and easy-to-understand manner might well decide your fit for the data science role!
Even when we’re working on a machine learning project, we often face situations where we encounter unexpected differences in performance or error rate between the training set and the test set (as shown below). How can a model perform so well on the training set and yet so poorly on the test set?
Here’s my personal experience: ask any seasoned data scientist about this and they typically start talking about an array of fancy terms like Overfitting, Underfitting, Bias…
Aspiring Data Scientist | Blogger | ML & DL enthusiast. Over 3.4 years of industry-relevant experience.