Learning is a lifelong process. But you must know what, where, and how to learn. What skills should you develop? Which skills will help you boost your career? If you are unsure, you are in the right place! Our tutorial section at CoderzColumn is dedicated to providing you with practical lessons that will give you the experience to learn Python for different purposes and to code on your own. Our tutorials cover a wide range of topics.
For an in-depth understanding of these concepts, check out the sections below.
A detailed guide on how to use the Python module "asyncio" and the keywords "async" & "await" for concurrent programming in Python. Tutorial covers how to create coroutines using "async/await" syntax and run them concurrently using the "asyncio" module. Tutorial covers topics like creating coroutines, collecting results once coroutines are complete, waiting for coroutines, canceling coroutines, retrieving pending coroutines, etc., in detail.
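As a rough sketch of the "async/await" syntax and the "asyncio.gather()" way of collecting results (the task names and delays below are made up purely for illustration):

```python
import asyncio

async def fetch(name, delay):
    # Coroutine simulating an I/O-bound task
    await asyncio.sleep(delay)
    return f"{name} finished after {delay}s"

async def main():
    # Run both coroutines concurrently and collect results once they complete
    results = await asyncio.gather(fetch("task-1", 1), fetch("task-2", 2))
    for result in results:
        print(result)

asyncio.run(main())
```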
A comprehensive guide on how to use the Python library 'hyperopt' for hyperparameter tuning with simple examples. Tutorial explains how to fine-tune scikit-learn models solving regression and classification tasks. Tutorial is a complete guide to hyperparameter optimization of ML models in Python using 'hyperopt'.
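A minimal sketch of what tuning a scikit-learn model with 'hyperopt' can look like (the model and search space below are illustrative assumptions, not taken from the tutorial):

```python
from hyperopt import fmin, tpe, hp, Trials
from sklearn.datasets import load_diabetes
from sklearn.linear_model import Ridge
from sklearn.model_selection import cross_val_score

X, y = load_diabetes(return_X_y=True)

def objective(params):
    # hyperopt minimizes the objective, so return the negative CV score
    model = Ridge(alpha=params["alpha"])
    return -cross_val_score(model, X, y, cv=3).mean()

space = {"alpha": hp.loguniform("alpha", -5, 5)}  # illustrative search space
best = fmin(fn=objective, space=space, algo=tpe.suggest,
            max_evals=25, trials=Trials())
print(best)
```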
A simple guide to using the naive Bayes classifiers available from scikit-learn to solve classification tasks. All 5 naive Bayes classifiers available from scikit-learn are covered in detail. Tutorial first trains classifiers with default parameters on the digits dataset and then performs hyperparameter tuning to improve performance. Various ML metrics are also evaluated to check the performance of the models.
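For instance, training one of these classifiers with default parameters on the digits dataset might look roughly like this (GaussianNB is just one of the five covered):

```python
from sklearn.datasets import load_digits
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import GaussianNB
from sklearn.metrics import accuracy_score

X, y = load_digits(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

model = GaussianNB()                 # default parameters
model.fit(X_train, y_train)
print(accuracy_score(y_test, model.predict(X_test)))
```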
A simple guide to working with dates, timestamps, periods, time deltas, and time zones using the Python library Pandas. Tutorial covers aspects like creating date-time ranges / timestamps / time deltas / periods / period ranges, adding / subtracting time deltas from dates / periods, adding a time zone to dates, converting the time zone of dates, etc.
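A small sketch of the kinds of operations covered (the dates and time zones used here are arbitrary examples):

```python
import pandas as pd

dates = pd.date_range(start="2021-01-01", periods=5, freq="D")   # date-time range
ts = pd.Timestamp("2021-01-01 10:30")                            # timestamp
shifted = ts + pd.Timedelta(days=2, hours=3)                     # add a time delta
period = pd.Period("2021-01", freq="M")                          # monthly period
localized = ts.tz_localize("UTC").tz_convert("US/Eastern")       # time zone handling
print(dates, shifted, period, localized, sep="\n")
```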
A detailed guide to resampling time series data using the Python Pandas library. Tutorial covers the pandas functions ('asfreq()' & 'resample()') used to upsample and downsample time series data. Apart from resampling, the tutorial covers a guide to applying moving window functions ('rolling()', 'expanding()' & 'ewm()') to time series data as well. The rolling window, expanding window, and exponential moving average are all covered in the tutorial.
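A brief sketch of these functions applied to a made-up daily series:

```python
import numpy as np
import pandas as pd

idx = pd.date_range("2021-01-01", periods=10, freq="D")
s = pd.Series(np.arange(10, dtype=float), index=idx)

weekly = s.resample("W").mean()             # downsample: weekly averages
upsampled = s.asfreq("12H")                 # upsample: new points filled with NaN
rolling_mean = s.rolling(window=3).mean()   # rolling window
expanding_sum = s.expanding().sum()         # expanding window
ewm_mean = s.ewm(span=3).mean()             # exponential moving average
```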
A complete guide on how to use the Python library "email" to represent emails with simple examples. Tutorial covers topics like creating a simple email message, emails with CC / BCC, emails with different types of attachments, setting headers in emails, combining different content types in emails, etc.
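A minimal sketch of building such a message (the addresses and the attached file name are placeholders):

```python
from email.message import EmailMessage

msg = EmailMessage()
msg["From"] = "sender@example.com"       # placeholder addresses
msg["To"] = "receiver@example.com"
msg["Cc"] = "copy@example.com"
msg["Subject"] = "Monthly Report"
msg.set_content("Please find the report attached.")

# Attach a file (assumes 'report.pdf' exists locally)
with open("report.pdf", "rb") as f:
    msg.add_attachment(f.read(), maintype="application",
                       subtype="pdf", filename="report.pdf")
print(msg)
```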
A simple guide to creating sunburst charts in Python using the interactive data visualization library Plotly. Tutorial explains how we can use the plotly express and plotly graph objects APIs of the library to create sunburst charts. The sunburst chart is also referred to by other names like multi-level pie chart, ring chart, donut chart, doughnut chart, or radial treemap.
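For example, with the plotly express API a sunburst chart can be created in a couple of lines (using the 'tips' sample dataset bundled with plotly):

```python
import plotly.express as px

df = px.data.tips()   # sample dataset shipped with plotly
fig = px.sunburst(df, path=["day", "time", "sex"], values="total_bill")
fig.show()
```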
A detailed guide on how to use the Python library "smtplib" to send emails (Gmail, Yahoo, etc) with simple examples. Tutorial covers various operations with mailbox servers like login / logout, verifying email ids, sending emails with CC / BCC, sending emails with attachments, etc. It uses the SMTP protocol behind the scenes to send emails.
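A rough sketch of sending a message through Gmail's SMTP server (the addresses and app password below are placeholders you would replace with your own):

```python
import smtplib
from email.message import EmailMessage

msg = EmailMessage()
msg["From"] = "sender@example.com"       # placeholder address
msg["To"] = "receiver@example.com"       # placeholder address
msg["Subject"] = "Hello"
msg.set_content("Sent via smtplib.")

# Connect over SSL, log in, send the message, and log out automatically
with smtplib.SMTP_SSL("smtp.gmail.com", 465) as server:
    server.login("sender@example.com", "app-password")   # placeholder credentials
    server.send_message(msg)
```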
A comprehensive guide on how to use the Python module "signal" to send, receive, and handle system (Unix/Windows) signals that notify processes about events. The module lets us catch a signal and run a handler (callback) based on the event the signal represents. Signals can be sent to different threads and processes to inform them about an event so that they execute the appropriate handler.
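A small sketch of registering and triggering a handler on Unix (the handler body is purely illustrative):

```python
import os
import signal
import time

def handler(signum, frame):
    # Callback executed when the registered signal is received
    print(f"Received signal {signum}")

signal.signal(signal.SIGTERM, handler)     # register handler for SIGTERM
os.kill(os.getpid(), signal.SIGTERM)       # send the signal to this process
time.sleep(0.1)                            # give the handler a moment to run
```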
A simple guide on how to use the Python module "difflib" to compare sequences and find the differences between them. Tutorial explains the whole API of the module, covering different ways of comparing sequences and formatting the results. It can be very useful for comparing file contents to see differences.
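A short sketch of comparing two small sequences and formatting the result as a unified diff (the file names are only labels for the output):

```python
import difflib

old = ["line one", "line two", "line three"]
new = ["line one", "line 2", "line three", "line four"]

# Format the differences the way 'diff -u' does
for line in difflib.unified_diff(old, new, fromfile="old.txt",
                                 tofile="new.txt", lineterm=""):
    print(line)

# SequenceMatcher gives a similarity ratio between two sequences
print(difflib.SequenceMatcher(None, "apple", "apples").ratio())
```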
Parallel computing is a type of computation where tasks are assigned to individual processes for completion. These processes can run on a single computer or on a cluster of computers. Parallel computing can make multi-tasking much faster.
Python provides different libraries (joblib, dask, ipyparallel, etc) for performing parallel computing.
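As a quick illustration, joblib can spread independent function calls across worker processes (the function here is a trivial stand-in for real work):

```python
from joblib import Parallel, delayed

def square(x):
    return x * x

# Run the calls on 4 worker processes in parallel
results = Parallel(n_jobs=4)(delayed(square)(i) for i in range(10))
print(results)   # [0, 1, 4, ..., 81]
```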
Concurrent computing is a type of computing where multiple tasks make progress during overlapping time periods. Concurrent programming is a style of programming where we divide a big task into smaller tasks that can run independently. These tasks can be executed using threads, processes, or an event loop.
Python provides various libraries (threading, multiprocessing, concurrent.futures, asyncio, etc) to create concurrent code.
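For example, concurrent.futures lets a pool of threads work through I/O-bound tasks concurrently (the URLs below are arbitrary examples, and the snippet needs network access):

```python
from concurrent.futures import ThreadPoolExecutor
import urllib.request

def fetch(url):
    # I/O-bound task: download a page and report its size
    with urllib.request.urlopen(url) as resp:
        return url, len(resp.read())

urls = ["https://www.python.org", "https://docs.python.org"]
with ThreadPoolExecutor(max_workers=2) as pool:
    for url, size in pool.map(fetch, urls):
        print(url, size)
```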
Once our Machine Learning model is trained, we need some way to evaluate its performance. We need to know whether our model generalizes well to unseen data or not.
For this, various metrics (confusion matrix, ROC AUC curve, precision-recall curve, silhouette analysis, elbow method, etc) have been designed over time. These metrics help us understand the performance of our models trained on various tasks like classification, regression, clustering, etc.
Python has various libraries (scikit-learn, scikit-plot, yellowbrick, interpret-ml, interpret-text, etc) to calculate and visualize these metrics.
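A small sketch of computing a few of these metrics with scikit-learn (the dataset and model are arbitrary choices for illustration):

```python
from sklearn.datasets import load_breast_cancer
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import confusion_matrix, roc_auc_score, classification_report
from sklearn.model_selection import train_test_split

X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

model = LogisticRegression(max_iter=5000).fit(X_train, y_train)
preds = model.predict(X_test)

print(confusion_matrix(y_test, preds))
print(roc_auc_score(y_test, model.predict_proba(X_test)[:, 1]))
print(classification_report(y_test, preds))
```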
After training an ML model, we generally evaluate its performance by calculating and visualizing various ML metrics (confusion matrix, ROC AUC curve, precision-recall curve, silhouette analysis, elbow method, etc).
These metrics are normally a good starting point. But in many situations, they don't give a complete picture of model performance. For example, a simple cat vs dog image classifier can be using background pixels to classify images instead of pixels belonging to the actual object (cat or dog).
In such situations, our ML metrics can still report good results. Hence, we should always be a little skeptical of model performance.
We can dig deeper and try to understand how our model performs on individual examples by interpreting its predictions. Various algorithms have been developed over time to interpret predictions of ML models, and many Python libraries (lime, eli5, treeinterpreter, shap, etc) provide their implementations.
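As a rough sketch, here is how 'lime' can be used to explain a single prediction of a tabular model (the dataset and model are illustrative assumptions):

```python
from lime.lime_tabular import LimeTabularExplainer
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import RandomForestClassifier

data = load_breast_cancer()
model = RandomForestClassifier(random_state=0).fit(data.data, data.target)

explainer = LimeTabularExplainer(data.data,
                                 feature_names=list(data.feature_names),
                                 class_names=list(data.target_names),
                                 mode="classification")
# Which features drove the prediction for this one example?
exp = explainer.explain_instance(data.data[0], model.predict_proba, num_features=5)
print(exp.as_list())
```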
Data visualization is the graphical representation of information / data. It is one of the most efficient ways of communicating information with users, as humans are quite good at spotting patterns in data.
Python has a bunch of libraries that can help us create data visualizations. Some of these libraries (matplotlib, seaborn, plotnine, etc) generate static charts whereas others (bokeh, plotly, bqplot, altair, holoviews, cufflinks, hvplot, etc) generate interactive charts. The majority of basic visualizations like bar charts, line charts, scatter plots, histograms, box plots, pie charts, etc are supported by all of these libraries. Many libraries also support advanced visualizations, widgets, and dashboards.
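As a tiny example of the static side, a couple of basic charts with matplotlib (the data is made up):

```python
import matplotlib.pyplot as plt

categories = ["A", "B", "C"]
values = [10, 24, 17]

fig, (ax1, ax2) = plt.subplots(1, 2, figsize=(8, 3))
ax1.bar(categories, values)                 # bar chart
ax2.plot([1, 2, 3, 4], [1, 4, 9, 16])       # line chart
ax1.set_title("Bar Chart")
ax2.set_title("Line Chart")
plt.tight_layout()
plt.show()
```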
Basic data visualizations like bar charts, line charts, scatter plots, histograms, box plots, pie charts, etc are quite good at representing information and exploring relationships between data variables.
But sometimes these visualizations are not enough and we need to analyze data from different perspectives. For this purpose, many advanced visualizations have been developed over time, like Sankey diagrams, candlestick charts, network charts, chord diagrams, sunburst charts, radar charts, parallel coordinates charts, etc. Python has many data visualization libraries that let us create such advanced data visualizations.
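As one example of such an advanced chart, a candlestick chart can be built with the plotly graph objects API (the OHLC values below are invented purely for illustration):

```python
import plotly.graph_objects as go

fig = go.Figure(go.Candlestick(
    x=["2021-01-01", "2021-01-02", "2021-01-03", "2021-01-04"],
    open=[100, 102, 101, 105],
    high=[104, 106, 107, 108],
    low=[99, 100, 100, 103],
    close=[102, 101, 105, 107],
))
fig.show()
```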
Deep learning is a field of Machine Learning that uses deep neural networks to solve tasks. Neural networks with more than one hidden layer are generally referred to as deep neural networks.
Many real-world tasks like object detection, image classification, image segmentation, etc cannot be solved well with simple machine learning models (decision trees, random forest, logistic regression, etc). Research has shown that neural networks with many layers are quite good at solving these kinds of tasks involving unstructured data (image, text, audio, video, etc). Deep neural networks nowadays can have different kinds of layers like convolutional, recurrent, etc apart from dense layers.
Python has many famous deep learning libraries (PyTorch, Keras, JAX, Flax, MXNet, Tensorflow, Sonnet, Haiku, PyTorch Lightning, Scikeras, Skorch, etc) that let us create deep neural networks to solve complicated tasks.
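A minimal sketch of defining and training a small deep neural network with Keras (the architecture and hyperparameters are arbitrary illustrative choices):

```python
from tensorflow import keras

# A small fully-connected network for the MNIST digits dataset
model = keras.Sequential([
    keras.layers.Flatten(input_shape=(28, 28)),
    keras.layers.Dense(128, activation="relu"),    # hidden layer
    keras.layers.Dense(10, activation="softmax"),  # output layer
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

(x_train, y_train), _ = keras.datasets.mnist.load_data()
model.fit(x_train / 255.0, y_train, epochs=1, batch_size=128)
```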
Image classification is a sub-field of computer vision and image processing that identifies an object present in an image and assigns a label to the image based on it. Image classification generally works on images with a single object present in them.
Over the years, many deep neural networks (VGG, ResNet, AlexNet, MobileNet, etc) were developed that solve the image classification task with quite high accuracy. Due to the high accuracy of these networks, many Python deep learning libraries started providing them. We can simply load these networks with pre-trained weights and make predictions using them.
Python libraries PyTorch and MXNet have helper modules named 'torchvision' and 'gluoncv' respectively that provide implementations of these image classification networks.
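A rough sketch of classifying a local image with a pre-trained ResNet from 'torchvision' (the image path is a placeholder, and the weights API shown assumes a recent torchvision release):

```python
import torch
from PIL import Image
from torchvision import models

# Load ResNet-18 with pre-trained ImageNet weights (torchvision >= 0.13 API)
weights = models.ResNet18_Weights.DEFAULT
model = models.resnet18(weights=weights)
model.eval()

img = Image.open("cat.jpg")                     # placeholder image path
batch = weights.transforms()(img).unsqueeze(0)  # preprocessing bundled with the weights

with torch.no_grad():
    probs = model(batch).softmax(dim=1)
print(weights.meta["categories"][probs.argmax().item()])
```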