Posted on Feb 11, 2023

## Mastering NLP Tokenization Techniques: A Comprehensive Guide for Effective Text Analysis

Posted in Machine Learning

Natural Language Processing (NLP) is a field of artificial intelligence that deals with the interaction between computers and human language. It has become essential in today’s digital age, where businesses and organizations rely on automated systems to analyze large amounts of textual data. Tokenization is a key step in NLP, which involves...
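As a quick taste of what tokenization means, here is a minimal whitespace-splitting sketch (illustrative only; the post itself may cover more sophisticated schemes):

```python
def whitespace_tokenize(text):
    # the simplest tokenization scheme: split the text on whitespace
    return text.split()

tokens = whitespace_tokenize("We will live on Mars soon")
```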

Posted on Mar 12, 2022

## Why Is ReLU Not Differentiable at x = 0?

Posted in Machine Learning

ReLU is one of the most widely used activation functions. For any $$x > 0$$, the output of ReLU is $$x$$, and $$0$$ otherwise. We can also write it as $$ReLU(x) = max(0, x)$$. For the rest of the post, let’s say $$f(x) = ReLU(x)$$....
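The definition above can be written out directly; a minimal sketch (note that the derivative value at $$x = 0$$ is a convention chosen by frameworks, not part of the mathematical definition — which is exactly the point of the post):

```python
def relu(x):
    # ReLU(x) = max(0, x)
    return max(0.0, x)

def relu_grad(x):
    # the derivative is 1 for x > 0 and 0 for x < 0;
    # at x = 0 it is undefined, and frameworks conventionally pick 0
    return 1.0 if x > 0 else 0.0
```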

Posted on Nov 27, 2021

## Weight Decay: basics with implementations

Posted in Machine Learning

For anyone out there working with Machine Learning models, overfitting is a familiar challenge. We can overcome it by going out and collecting more data. But this can be costly, time-consuming, or sometimes even impossible for individuals! So, what do we do? We can follow some regularization techniques. Weight Decay is...
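A minimal sketch of what weight decay does inside a plain SGD update (the `lr` and `wd` values here are arbitrary, for illustration only):

```python
def sgd_step_with_weight_decay(weights, grads, lr=0.1, wd=0.01):
    # weight decay adds wd * w to each gradient, shrinking every
    # weight toward zero on top of the ordinary gradient step
    return [w - lr * (g + wd * w) for w, g in zip(weights, grads)]
```

Even with a zero gradient, every step multiplies each weight by (1 − lr·wd), which is where the name "decay" comes from.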

Posted on Oct 01, 2021

## Byte Pair Encoding (BPE) and Subword Tokenization

Posted in Machine Learning

In almost every NLP application, we use text as a part of the data. The raw input is generally a list of words or sentences, like “We will live on Mars soon”, but to a model we feed the text as a sequence of tokens. The tokens can be characters, space-separated...
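As a taste of how BPE chooses its merges, here is a minimal sketch of counting the most frequent adjacent symbol pair over a toy vocabulary (the toy frequencies are made up for illustration):

```python
from collections import Counter

def most_frequent_pair(word_freqs):
    # word_freqs maps a tuple of symbols to its corpus frequency;
    # BPE repeatedly merges the most frequent adjacent symbol pair
    pairs = Counter()
    for symbols, freq in word_freqs.items():
        for a, b in zip(symbols, symbols[1:]):
            pairs[(a, b)] += freq
    return pairs.most_common(1)[0][0]
```

Each merge turns the winning pair into a new single symbol, and the count-and-merge loop repeats until the desired vocabulary size is reached.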

Posted on Jul 16, 2021

## Language Model Integration in Encoder-Decoder Speech Recognition

Posted in Machine Learning

Currently, Attention-based recurrent Encoder-Decoder models provide an elegant way of building end-to-end models for different tasks, such as automatic speech recognition (ASR) and machine translation (MT). An end-to-end ASR model folds the traditional acoustic model, pronunciation model, and language model (LM) into a single network. An encoder maps the input speech to a sequence of higher-level...
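One common integration method (named here as background, not quoted from the post) is shallow fusion, which combines the end-to-end model's score with an external LM's score log-linearly at decoding time; a minimal sketch, where the LM weight is a tunable hyperparameter:

```python
def shallow_fusion_score(log_p_model, log_p_lm, lm_weight=0.3):
    # shallow fusion: interpolate the end-to-end model's log-probability
    # with an external language model's log-probability during decoding
    return log_p_model + lm_weight * log_p_lm
```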

Posted on Jul 10, 2021

## Machine Learning Metrics: When to Use What

Posted in Machine Learning

Over the years, several metrics have been introduced to evaluate the performance of a Machine Learning algorithm or model. It can be tricky to choose the right metric for evaluating our model. In this article, I discuss some basic metrics used in ML-related tasks and when to use which metric....
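As one example of such metrics, precision, recall, and F1 can all be computed from confusion-matrix counts; a minimal sketch with the standard definitions:

```python
def precision_recall_f1(tp, fp, fn):
    # precision: of everything predicted positive, how much was right
    precision = tp / (tp + fp)
    # recall: of everything actually positive, how much was found
    recall = tp / (tp + fn)
    # F1: harmonic mean of precision and recall
    f1 = 2 * precision * recall / (precision + recall)
    return precision, recall, f1
```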

Posted on Jul 04, 2021

## What is Label Smoothing in Machine Learning?

Posted in Machine Learning

Nowadays, we use deep neural networks, or deep learning, for a lot of tasks. While working with deep learning, we very often face situations like overfitting and overconfidence. Overfitting is relatively well studied and can be tackled with several strategies, like dropout, weight regularization, early stopping, etc. We have tools for tackling overconfidence...
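A minimal sketch of the uniform label-smoothing formulation, mixing the hard one-hot target with a uniform distribution (the ε value is illustrative):

```python
def smooth_labels(one_hot, eps=0.1):
    # y_smooth = (1 - eps) * y + eps / K, where K is the number of classes;
    # the correct class gets slightly less than 1, the rest slightly more than 0
    k = len(one_hot)
    return [(1 - eps) * y + eps / k for y in one_hot]
```

The smoothed targets still sum to 1, but the model is no longer pushed toward infinitely confident logits for the correct class.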

Posted on Jul 03, 2021

## What is Overconfidence in Machine Learning?

Posted in Machine Learning

In machine learning, overfitting is a very commonly used term. But what is overconfidence? Let’s talk about it today with some solid background and examples. Well, if we think about overconfidence in our everyday lives, what pops into our minds? It’s being excessively confident about something. Like when we are driving a...

Posted on Jul 02, 2021

## Cross-validation: KFold and StratifiedKFold with examples

Posted in Machine Learning

In this post, we are going to discuss KFold and StratifiedKFold with some hands-on examples. While working with a supervised machine learning model, we have data with features and labels or targets. During training, we use some portion of the data and keep a portion aside to test the model. Thus, we need...
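To make the idea concrete, here is a minimal pure-Python sketch of how KFold partitions sample indices (mirroring scikit-learn's unshuffled behavior; StratifiedKFold additionally preserves the label distribution within each fold):

```python
def kfold_indices(n_samples, n_splits):
    # yield (train_indices, test_indices) for each of n_splits folds;
    # every sample appears in exactly one test fold
    fold_sizes = [n_samples // n_splits] * n_splits
    for i in range(n_samples % n_splits):
        fold_sizes[i] += 1  # spread any remainder over the first folds
    indices = list(range(n_samples))
    start = 0
    for size in fold_sizes:
        test = indices[start:start + size]
        train = indices[:start] + indices[start + size:]
        yield train, test
        start += size
```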

Posted on Jun 25, 2021

## What is Tensorflow Hub?

Posted in Machine Learning

In the simplest words, TensorFlow Hub is a repository of many pre-trained models, ready to fine-tune and deploy anywhere. It includes text embeddings, image classification models, TF.js/TFLite models, and many more....
