If you torture the data long enough, it will confess and give good insights.
Hey, we're going to understand how to learn data science. Basically, I'm going to explain the process that I actually applied for my own transition into data science, and you can follow a similar path. To begin with, at the centre I have data science. For data science you have to know at least one programming language. I would rank Python first, then R, and then Java, but I would prefer Python or R because they have a lot of libraries, and with the help of those libraries you can implement various machine learning algorithms. When we consider machine learning, there are various techniques like supervised learning, unsupervised learning, reinforcement learning and many more. Within those you will basically have problems like classification, regression and reinforcement learning, and as you know, deep learning is a subset of machine learning, so I have mentioned deep learning as well. Apart from that, I would also like to mention clustering algorithms. Most of your problem statements actually revolve around these kinds of scenarios, and you will basically be using machine learning for them. Deep learning is the subset of machine learning where you implement these ideas with the help of neural networks. After that, you need to know some tools, that is, what kind of IDE or editor you are using for coding in Python or R, and there are many. For Python, just as an example, there is a tool called PyCharm which is very nice; you also have Jupyter and something called Spyder. Apart from that, for the R programming language you have RStudio, and for Python you also have Visual Studio, which is also very good.
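To make the supervised-learning terms above concrete, here is a minimal sketch with scikit-learn on invented toy data: one classification model (predict a label) and one regression model (predict a number). The data and numbers are assumptions for illustration, not from the talk.

```python
# Minimal supervised-learning sketch: classification vs regression.
from sklearn.linear_model import LinearRegression, LogisticRegression

# Toy classification: label is 1 only when both features are 1.
X_cls = [[0, 0], [0, 1], [1, 0], [1, 1]]
y_cls = [0, 0, 0, 1]
clf = LogisticRegression().fit(X_cls, y_cls)
print(clf.predict([[0, 0]]))  # predicts the majority-looking class, 0

# Toy regression: the target is exactly 2 * feature.
X_reg = [[1], [2], [3], [4]]
y_reg = [2.0, 4.0, 6.0, 8.0]
reg = LinearRegression().fit(X_reg, y_reg)
print(reg.predict([[5]]))  # close to 10.0, since the fit is linear
```

The same `fit`/`predict` pattern carries over to almost every algorithm mentioned here, which is why learning one library well pays off.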
It's a nice tool where you can debug your code and do a lot of stuff. So you need to have knowledge of one of these IDEs if you're going ahead with Python, and if you're going ahead with R, make sure you know how to work with RStudio, because that is also a very good IDE. Then you need to know web scraping. This will actually help you read data directly from some URL, for example in the form of JSON. Apart from that, you have various libraries like pandas and NumPy that will also help you do it, so you need some basic knowledge of web scraping. Apart from that, maths: specifically I would like you to focus on statistics, linear algebra and differential calculus, because most of the algorithms are basically built on these concepts. Then you also need to know data visualisation. I have written Tableau and Power BI; these are different tools where you'll be able to do a lot of data visualisation work. Apart from that, in Python and in R you have libraries like matplotlib and seaborn which will help you do a lot of visualisations with respect to the code you are writing. And then you come to the data analysis stage, which is also very important, where you do feature engineering, data wrangling, exploratory data analysis and a lot of other things. Now, from all these components that you've seen, you have to follow one thing at a time. Suppose you have selected a programming language; then you should learn all these algorithms and try to understand the maths behind each and every one. It is not like you just have to solve it somehow; understand how it is basically getting implemented, because the main thing is your data, and based on the data you will be using different techniques.
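The "read data from a URL as JSON" step can be sketched with pandas. The endpoint and column names below are placeholders (assumptions); in practice you would pass the real URL straight to `pd.read_json`, but here the response body is simulated so the sketch runs offline.

```python
import io

import pandas as pd

# In practice: df = pd.read_json("https://example.com/houses.json")
# Simulated JSON response body (a list of records):
payload = '[{"area": 1200, "price": 250000}, {"area": 1500, "price": 310000}]'
df = pd.read_json(io.StringIO(payload))

print(df.shape)            # (2, 2): two records, two columns
print(df["price"].mean())  # 280000.0
```

Once the JSON lands in a DataFrame, all the pandas/NumPy analysis steps discussed here apply directly.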
Suppose I have a use case where I need to predict house prices for a particular city. What are the things I'll require? First of all, I require data; I may get it through web scraping, or I may depend on some third-party APIs. After that, I may do some data analysis on that particular data, like feature engineering, data wrangling and exploratory data analysis. When I am selecting the algorithms, I will actually use that data analysis, and along with that I'll also apply some maths to select a particular algorithm, work through that algorithm with the help of the mathematical techniques I have learned, and then I'll be able to implement it. Apart from that, before implementing the machine learning part, we may use libraries and tools like matplotlib, seaborn, Tableau and Power BI to actually understand the data: how the data is distributed, whether it forms a normal distribution, whether it is a standard normal distribution, whether I can convert it into a standard normal distribution, whether I have outliers in the data, whether the data is imbalanced. There are a lot of things you can basically find out, a lot of information you can take out of that particular data. The main stage is this data analysis. It is not like you have to study everything separately.
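The distribution and outlier checks described above can be sketched like this. The column name and the numbers are invented; the idea is to standardise a column (z-score, which is also how you move towards a standard normal distribution) and flag points far from the mean.

```python
import numpy as np
import pandas as pd

# Fake price data: roughly normal, plus one injected extreme outlier.
rng = np.random.default_rng(0)
df = pd.DataFrame({"price": rng.normal(loc=300_000, scale=50_000, size=1_000)})
df.loc[0, "price"] = 2_000_000  # obvious outlier

# Z-score standardisation, then flag points beyond 3 standard deviations.
z = (df["price"] - df["price"].mean()) / df["price"].std()
outliers = df[np.abs(z) > 3]

print(len(outliers))        # the injected point is caught here
print(df["price"].skew())   # large positive skew hints the data is not normal
```

Checks like these are exactly what decides whether you transform a feature, drop outliers, or pick a different algorithm.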
Suppose I'm taking the house price prediction use case. What am I going to do with it? First of all, I require data. Currently I don't even have to collect it myself, because for a first use case the data is readily available, and since you're practising and making a career transition towards data science, that is fine. So you take a use case, try to understand what that particular use case is all about, and then apply all these techniques as required. The first technique you'll apply is data analysis. The second is that after the data analysis you will also visualise the data to understand more about it. The third thing is the machine learning part: selecting the algorithm, seeing whether it is a classification problem or a regression problem, deciding which algorithm you are going to apply; everything comes in here. And then obviously you'll be selecting one IDE, that is for sure; you'll be using PyCharm, Jupyter or Spyder, some of the best-known IDEs for Python programming, while implementing any machine learning algorithm. If you remember, even a lot of companies, like Amazon with its AWS cloud, and Azure, are providing integrated Jupyter IDEs where you can code and deploy directly into production. Again, the deployment part is completely different; for deployment you'll have one more scenario. In this particular deployment stage you are going to use different tools like AWS or Azure. You may also think of Spark, but it should not be Spark, because Spark comes under big data. So with AWS or Azure, suppose I take the example of AWS: you may take an EC2 instance and deploy a Flask model over there.
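Serving a model behind a Flask API, as mentioned, can be sketched like this. The route name, payload shape and "model" are all assumptions for illustration; a trained model object would replace the placeholder function.

```python
from flask import Flask, jsonify, request

app = Flask(__name__)

# Stand-in for a trained model: a trivial price rule, not a real model.
def predict_price(area_sqft: float) -> float:
    return 200.0 * area_sqft  # placeholder coefficient

@app.route("/predict", methods=["POST"])
def predict():
    data = request.get_json()
    price = predict_price(float(data["area"]))
    return jsonify({"predicted_price": price})

# On an EC2 instance you would run something like:
#   app.run(host="0.0.0.0", port=5000)

# Quick local check without starting a server:
client = app.test_client()
resp = client.post("/predict", json={"area": 1000})
print(resp.get_json())  # {'predicted_price': 200000.0}
```

The front end then just POSTs JSON to the endpoint, which is what "create an API so that it can be consumed" means in practice.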
You should actually integrate your model with a Flask application, upload it to AWS and create an API so that it can be consumed in the front end. So how did I study? How did I learn? First of all, I took a very basic use case, and as I took it I was basically reverse engineering each and every step of this: how the data analysis was done, and so on. Let me just remind you that one use case already done and freely available is the house price prediction one. The first use case that I did myself was the Iris dataset. In the Iris dataset, we need to classify what kind of iris flower it is based on the sepal length and sepal width, and for this dataset the solution was clearly given on the internet itself. Then I explored it, did the reverse engineering, did the data analysis part, did the visualisation part, and came to know a lot of things. Initially, when I was learning Python too, you should remember that I was not perfect with pandas, NumPy and all; it is through reverse engineering that I focused, tried to understand the subject much more properly, and got more and more knowledge. When you do reverse engineering, you too will be able to understand a lot of stuff. One example I did with the Iris dataset. Then, whatever domain I was working in, with that business knowledge I could apply the same approach: I could create a new use case and solve it with the help of machine learning or deep learning, or data science itself. So I was able to create a data science project and complete it, and that is how, even though you're working in some different domain, you can show that a particular problem can be solved with the help of machine learning.
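The Iris exercise described above looks roughly like this in scikit-learn: classify the species from the flower measurements. The choice of k-nearest neighbours and the split parameters are my assumptions; the freely available solutions use various algorithms.

```python
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier

# Load the classic Iris dataset (sepal/petal lengths and widths, 3 species).
X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=42, stratify=y
)

model = KNeighborsClassifier(n_neighbors=5).fit(X_train, y_train)
print(model.score(X_test, y_test))  # accuracy; typically high on Iris
```

Reverse engineering a worked solution like this, line by line, is exactly the learning loop being described.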
That will be of great use to the company; even your managers will appreciate that particular work, because you are trying to solve a real problem through machine learning techniques. So this is how you have to go: basically do reverse engineering, try to follow this pattern, and look at the use case. It is not like every project can be turned into a data science project; understand a use case, and based on that use case ask whether you can solve the problem, and what steps you are going to apply. That is the whole diagram I've drawn in front of you; I've included everything, and you have to become proficient, you have to bring some perfection, based on the reverse engineering you do in various areas. Let me give you one more example. The first time I was handling categorical features, I only knew about one-hot encoding. Later on I got scenarios where I had many categorical features, and if I performed one-hot encoding it unnecessarily created so many columns. So I found a different way to handle that scenario, and as I said, it is all reverse engineering: I was doing a use case and that problem came up at that particular point of time. So I thought, how can I solve this? I did a lot of research and finally got a lot of input from the data science community on how to handle it. There were a lot of competitions too; I looked at Kaggle competitions, checked out the kernels that are freely available, and I was able to gain that knowledge. That is how you have to do reverse engineering at each and every stage: try to fix everything, resolve the particular problem, and come up with a very good accuracy. The more reverse engineering you do, the better it is.
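The one-hot versus many-categories problem mentioned above can be sketched with pandas. The column and city names are invented; frequency encoding is shown as one common alternative (the talk does not say which alternative was actually used).

```python
import pandas as pd

df = pd.DataFrame({"city": ["Pune", "Delhi", "Pune", "Mumbai", "Delhi", "Pune"]})

# One-hot encoding: one new column per category.
# With hundreds of categories this explodes into hundreds of columns.
one_hot = pd.get_dummies(df["city"], prefix="city")
print(one_hot.shape[1])  # 3 columns for 3 distinct cities

# One alternative: frequency encoding, a single numeric column where each
# category is replaced by its share of the rows.
freq = df["city"].map(df["city"].value_counts(normalize=True))
print(freq.tolist())  # e.g. "Pune" appears in half the rows, so it maps to 0.5
```

Other options in the same spirit include target encoding and hashing; the point is that high-cardinality features usually need something smarter than plain one-hot.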