A brief introduction to Machine Learning

Glen Fernandes
8 min read · Dec 15, 2020

This article is mainly directed towards the uninitiated, and anyone who wants a brief understanding of what all the buzz is about. It is a summary of my research into some aspects of ML.

Source: unsplash.com

Topics I will be covering in this article:

  1. What is machine learning?
  2. How machine learning is changing the programming paradigm.
  3. Is it really statistics in the background?
  4. The secret behind the rise of ML.
  5. Recent developments.
  6. AI bubble or AGI?

What is machine learning?

Machine learning is a branch of artificial intelligence that gives systems the ability to learn from data without being explicitly programmed.

Machine-learning algorithms use statistics to find patterns in massive amounts of data. And data, here, encompasses a lot of things — numbers, words, images, clicks, what have you. If it can be digitally stored, it can be fed into a machine-learning algorithm. — MIT Technology Review

Some examples of Machine Learning being used in major companies:

  1. Uber uses Natural Language Processing (NLP) and deep learning models to improve customer support.
  2. Netflix uses machine learning to improve user experience by recommending shows a user is likely to watch based on their viewing history.
  3. Facebook uses machine learning to filter out spam and misleading content using NLP, to automatically caption videos using speech recognition systems, and to target advertising by profiling users with data from companies like Epsilon and BlueKai.
  4. Apple, Google, and Amazon use machine learning in their virtual assistants Siri, Google Assistant, and Alexa respectively to make the user experience seamless.
  5. Companies like Tesla and Comma.ai use deep learning to bring self-driving capabilities to cars.
Source: Seekingalpha.com

How machine learning is changing the programming paradigm

Traditional programming has always been about encoding some desired behavior explicitly, for example when designing a calculator, a database, or a website.
Machine learning programming is more about finding the patterns or relationships between inputs and outputs, for example generating recommendations from a user's behavior or suggesting text on a phone keyboard based on previous typing history.

This picture from the book Deep Learning with Python by François Chollet captures the difference between traditional programming and machine learning.

Classical programming vs Machine learning.

Machine learning is used extensively by major companies to improve their applications or to make better, data-driven business decisions that generate greater revenue.

Neural networks are not just another classifier, they represent the beginning of a fundamental shift in how we write software. They are Software 2.0.

Source: Article

The “classical stack” of Software 1.0 is what we’re all familiar with — it is written in languages such as Python, C++, etc. It consists of explicit instructions to the computer written by a programmer. By writing each line of code, the programmer identifies a specific point in program space with some desirable behavior.

Traditional computer programming encodes fixed conditions into the computer, which are then applied to the given data to obtain solutions. But what if there is some anomaly in the data? Machine learning helps solve this problem by learning from the data itself and accounting for such anomalies.
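
To make the picture concrete, here is a minimal sketch of the two paradigms. The temperature-conversion example and the NumPy code are my own illustration, not taken from the book: classical programming hard-codes the rule, while machine learning recovers the rule from data and answers.

```python
import numpy as np

# Classical programming: the programmer writes the rule explicitly.
def celsius_to_fahrenheit(c):
    return c * 9 / 5 + 32

# Machine learning: we supply only data (inputs) and answers (outputs)
# and let the algorithm find the rule.
celsius = np.array([-10.0, 0.0, 8.0, 15.0, 22.0, 38.0])
fahrenheit = np.array([14.0, 32.0, 46.4, 59.0, 71.6, 100.4])

slope, intercept = np.polyfit(celsius, fahrenheit, deg=1)
print(f"learned rule: F ≈ {slope:.2f} * C + {intercept:.2f}")  # ≈ 1.80 * C + 32.00
```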

This article by Andrej Karpathy on Software 2.0 explains how machine learning is changing the way we write programs.

Differences between Statistical modelling and Machine Learning

Many machine learning algorithms, such as linear regression, logistic regression, decision trees, and SVMs, originated as statistical models, some of them (like least-squares regression) dating back to the 19th century. So are machine learning and artificial intelligence just glorified statistics, or is there more to it?

Source: sandserif

Machine learning is a subfield of computer science and artificial intelligence that deals with building systems that can learn from data instead of following explicitly programmed instructions.

Statistical modelling is a subfield of mathematics that deals with finding relationships between variables in order to predict an outcome.

The framework of ML is rooted in statistics, yet the gap between the two fields seems to grow larger week by week.

Although statistical modelling and machine learning differ in their approach to predictive modelling and in their jargon, many existing ML models stem from statistics, and one definitely needs a strong foundational understanding of statistics to be successful at ML.
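
To see the overlap in practice, here is a minimal sketch, assuming NumPy, SciPy, and scikit-learn are installed (the advertising numbers are made up for illustration). The same straight line is fitted twice: once with a classical statistical estimator that reports a p-value, and once through an ML library's train/predict interface.

```python
import numpy as np
from scipy import stats
from sklearn.linear_model import LinearRegression

# Made-up data: advertising spend vs. sales, purely for illustration.
spend = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])
sales = np.array([2.1, 3.9, 6.2, 8.1, 9.8, 12.2])

# The statistician's view: an estimator with a p-value and standard error.
stat_fit = stats.linregress(spend, sales)
print("statistics:", stat_fit.slope, stat_fit.intercept, stat_fit.pvalue)

# The ML view: "train" a model, then "predict" for unseen inputs.
model = LinearRegression().fit(spend.reshape(-1, 1), sales)
print("machine learning:", model.coef_[0], model.intercept_)
print("prediction for spend=7:", model.predict([[7.0]])[0])
```

Both fits produce the same slope and intercept; the difference lies mainly in the questions each field asks of the model.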

The rise of artificial intelligence

My next question was: why has artificial intelligence risen so rapidly in recent times?

Every time I opened YouTube there was a video on AI: the advent of AI, whether humans should be scared, and many others along similar lines. Considering how transformative AI has been, it definitely seems to have disruptive power, both for good and for bad.

The popularity of machine learning and deep learning has risen mainly because of two factors:

  1. Data:
    The amount of data being generated in recent times is enormous, mainly due to the increase in IoT devices and to more and more people getting access to the internet. Here are some statistics as of 2020:
Source: Techjury.net

Visit this link for more statistics about the amount of data being generated: Data generation.

2. Graphics Processing Unit (GPU)

A GPU is a processor that is good at handling specialized computations, as opposed to a CPU, which handles the general computations on our electronic devices. GPUs can be much faster than CPUs, but only at specific types of computations: the kind that can be done in parallel.

Parallel computing refers to the process of breaking down larger problems into smaller, independent, often similar parts that can be executed simultaneously by multiple processors communicating via shared memory, the results of which are combined upon completion as part of an overall algorithm.

Parallel computing. Source: omnisci.com

The number of smaller, similar, independent parts a larger problem can be divided into depends upon the number of cores the processor has. CPUs generally have 4, 8, or 16 cores, whereas high-end GPUs have cores numbering in the thousands, which deliver high computing performance. GPUs have evolved substantially in recent times thanks to the gaming industry. GGWP!
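
As a rough illustration of the idea (the squaring task here is a toy example, not a real workload), Python's standard-library multiprocessing module splits independent pieces of work across the available CPU cores and combines the results at the end:

```python
from multiprocessing import Pool

def square(x):
    # One small, independent unit of work; real workloads do far more per task.
    return x * x

if __name__ == "__main__":
    numbers = range(1_000_000)
    # By default, Pool starts one worker process per CPU core.
    with Pool() as pool:
        results = pool.map(square, numbers)
    print(sum(results))
```

A GPU applies the same principle, but with thousands of much simpler cores instead of a handful of general-purpose ones.
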
History_of_GPUs: this article walks through the history of GPUs.

Deep neural networks are embarrassingly parallel, which means that a lot of their computation can be done in parallel to speed up the process.

Nvidia, a GPU manufacturer and a pioneer in the field, has developed libraries (most notably CUDA) that support general-purpose programming on GPUs; combined with frameworks such as PyTorch and TensorFlow, they make it possible to train models in hours or days that would otherwise have taken weeks.
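
As a minimal sketch of what this looks like in practice (assuming PyTorch is installed and a CUDA-capable Nvidia GPU is available; otherwise the code falls back to the CPU):

```python
import torch

# Pick the GPU if one is available, otherwise fall back to the CPU.
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

# A large matrix multiplication is exactly the kind of embarrassingly
# parallel workload that GPUs excel at.
a = torch.randn(4096, 4096, device=device)
b = torch.randn(4096, 4096, device=device)

c = a @ b  # runs on the GPU when device is "cuda"
print(c.device, c.shape)
```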

Recent developments

Artificial intelligence has made major inroads into many fields today. Here are some examples of how major the recent breakthroughs have been.

  1. DeepMind Technologies, which was acquired by Google in 2014, developed AlphaGo, a model that defeated the world Go champion Lee Sedol. This article illustrates the whole story.

2. OpenAI, which counts Elon Musk among its founders, built GPT-3, a model based on the decoder of the transformer architecture, which can produce human-like text after training on vast amounts of natural language text.

3. DeepMind's AlphaFold model has solved the 50-year-old grand challenge in biology known as the protein folding problem, in what has been called a major breakthrough moment. Major implications for drug design and environmental sustainability have been envisaged. This article illustrates this recent feat.

Follow this article for other major developments from 2010 to 2019 — article

AI bubble, upcoming AI winter, or Artificial General Intelligence (AGI)?

After all this buzz, funding, and some major breakthroughs, the question arises: are we in an AI bubble, or is AI "the one"?

Many business reviews claim that we are definitely in an AI bubble, but that it resembles the dot-com bubble more than the housing bubble, and that its bursting would not wreak havoc.

A still from the movie WALL-E (2008). Source: movies.disney.com

An AI winter is a period of reduced interest in and funding for artificial intelligence research; the term was coined in 1984. There were two major winters, from 1974 to 1980 and from 1987 to 1993, and several smaller episodes later. But with recent developments and the amount of money being poured into the field, we can only speculate about what the world will look like in another 10 years.

Artificial General Intelligence refers to an all-powerful computer system that could complete any task a normal human is able to do. Although at this point we can only speculate, it definitely seems some way off. Research is ongoing into models that mimic the human brain.

Consider that the first iPhone was released in 2007, followed by competitors in the subsequent years. It has been only 13 years since then, and so much has been developed, ranging from voice assistants to self-driving cars and augmented reality. The only thing more astonishing is how quickly humans are adapting to the change.

Note:

This is more of a literature review of various articles on the internet, mostly from Medium, reflecting how I have developed my understanding of the subject. Most of the thoughts here are not mine. This is my journey into the bright, vast, probabilistic, and promising world of artificial intelligence.
I plan to revise this again and again as I gain a more holistic view and a better understanding of a subject that is evolving at a rapid pace, which I do not have at this point. So I have written it to my current understanding, to be updated as I learn more.
Any constructive criticism is appreciated.
