Timeline - Historical Events

AI Timeline


AI has come a long way since the study of the camera obscura in the 16th century. Here are some of the key milestones in the history of AI.


Color Coding:


Computer Vision - Green

Deep Learning - Blue

Optimization - Yellow

AI - Cyan

Camera Obscura - 16th Century

Invention of the camera obscura: a mechanical copy of vision, projecting an image of the world onto a surface.

Perceptron - 1957

Frank Rosenblatt's perceptron: a single-layer binary classifier with learnable weights, an early trainable model of a neuron.

1959 - Stimulus Cat Experiment

Hubel & Wiesel showed that the building blocks of vision are simple structures and shapes (oriented lines and edges).

1969 - Minsky and Papert

Showed that perceptrons could not learn the XOR function, causing a lot of disillusionment in the field.

Perceptual Grouping (Segmentation) - 1997

Normalized Cuts - Shi & Malik.

~ AI WINTERS

Neocognitron - 1980

Fukushima's Neocognitron: a hierarchical multi-layer network inspired by Hubel & Wiesel's findings, a precursor to modern CNNs.

1986 - Backprop

Rumelhart, Hinton & Williams introduced backpropagation for computing gradients in neural networks, successfully training multi-layer perceptrons.
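As a minimal sketch, the chain-rule bookkeeping behind backpropagation can be shown on a tiny two-layer network learning XOR, the very function a single-layer perceptron cannot represent. The architecture, learning rate, and variable names below are illustrative choices, not details from the 1986 paper:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)  # XOR inputs
y = np.array([[0], [1], [1], [0]], dtype=float)              # XOR targets

W1, b1 = rng.normal(size=(2, 4)), np.zeros(4)   # hidden layer (4 units)
W2, b2 = rng.normal(size=(4, 1)), np.zeros(1)   # output layer

losses = []
for _ in range(5000):
    # forward pass
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)
    losses.append(float(((out - y) ** 2).mean()))
    # backward pass: apply the chain rule layer by layer
    d_out = (out - y) * out * (1 - out)   # gradient at the output layer
    d_h = (d_out @ W2.T) * h * (1 - h)    # error propagated to the hidden layer
    W2 -= h.T @ d_out
    b2 -= d_out.sum(axis=0)
    W1 -= X.T @ d_h
    b1 -= d_h.sum(axis=0)
```

The key step is `d_out @ W2.T`: the output error is pushed backwards through the second layer's weights, which is exactly what a single-layer learning rule cannot do.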

1998 - LeNet

Gradient-Based Learning Applied to Document Recognition by LeCun et al., applying convolutional networks to handwritten digit recognition.

1999 - SIFT

Recognizing objects by matching distinctive local features (scale-invariant keypoints) - David Lowe.

Face Detection - 2001

Viola & Jones: localization of faces, the first deployed real-time face detector.

2009 - ImageNET

image-net.org: 14M images over 22K categories, crucial for deep learning advancements, by Fei-Fei, Deng, Dong & Socher.

AlexNet (CNN) - 2012

Breakthrough in deep learning for image classification. Won the ImageNet competition with a top-5 error of 15.3%, more than 10.8 percentage points better than the runner-up.

2012 - RMSProp

Proposed in Hinton's Coursera lectures. Improved gradient descent with adaptive learning rates.
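The core idea can be sketched in a few lines: divide each gradient by a running average of its recent magnitude, so every parameter gets its own effective step size. The function name, toy objective, and hyperparameter values below are illustrative, not from the lecture notes:

```python
import numpy as np

def rmsprop_step(w, grad, cache, lr=1e-3, decay=0.9, eps=1e-8):
    """One RMSProp update. `cache` is a moving average of grad**2."""
    cache = decay * cache + (1 - decay) * grad ** 2
    w = w - lr * grad / (np.sqrt(cache) + eps)
    return w, cache

# toy usage: minimize f(w) = w^2, whose gradient is 2w
w, cache = 5.0, 0.0
for _ in range(2000):
    grad = 2 * w
    w, cache = rmsprop_step(w, grad, cache, lr=0.05)
# w is now close to the minimum at 0
```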

VGGNet - 2014

Introduced deeper architectures using smaller kernels (3×3).

2014 - Adam Optimizer

Kingma and Ba proposed Adam, combining momentum and adaptive learning rates.
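A sketch of the Adam update, using the default hyperparameters from the paper; the function name and toy objective are illustrative:

```python
import numpy as np

def adam_step(w, grad, m, v, t, lr=1e-3, b1=0.9, b2=0.999, eps=1e-8):
    m = b1 * m + (1 - b1) * grad       # first moment: momentum
    v = b2 * v + (1 - b2) * grad ** 2  # second moment: adaptive scaling
    m_hat = m / (1 - b1 ** t)          # bias correction for early steps
    v_hat = v / (1 - b2 ** t)
    w = w - lr * m_hat / (np.sqrt(v_hat) + eps)
    return w, m, v

# toy usage: minimize f(w) = w^2 (t starts at 1 for bias correction)
w, m, v = 5.0, 0.0, 0.0
for t in range(1, 3001):
    grad = 2 * w
    w, m, v = adam_step(w, grad, m, v, t, lr=0.05)
# w is now close to the minimum at 0
```

The bias-correction terms matter early on: with `m` and `v` initialized to zero, the raw moving averages underestimate the true moments for the first few steps.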

ResNet - 2015

Introduced skip connections to train very deep networks effectively.
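A skip connection can be sketched as output = F(x) + x: gradients can flow through the identity path even when F itself is hard to optimize. The fully-connected block below is an illustrative simplification of the convolutional residual blocks in the paper:

```python
import numpy as np

def relu(x):
    return np.maximum(0, x)

def residual_block(x, W1, W2):
    """Two linear layers plus a skip connection (biases omitted)."""
    f = relu(x @ W1) @ W2   # the learned residual function F(x)
    return relu(f + x)      # skip connection: add the input back

rng = np.random.default_rng(0)
x = rng.normal(size=(1, 8))
W1 = rng.normal(size=(8, 8)) * 0.1
W2 = rng.normal(size=(8, 8)) * 0.1
out = residual_block(x, W1, W2)
print(out.shape)  # (1, 8): same shape as the input, by construction
```

Because the addition requires matching shapes, real ResNets use a projection on the skip path whenever F changes the dimensionality.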

2015 - Batch Normalization

Normalized layer activations to improve convergence.
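The forward pass can be sketched as: normalize each feature to zero mean and unit variance over the batch, then rescale with learnable gamma and beta. Variable names below are illustrative:

```python
import numpy as np

def batch_norm(x, gamma, beta, eps=1e-5):
    mu = x.mean(axis=0)                     # per-feature batch mean
    var = x.var(axis=0)                     # per-feature batch variance
    x_hat = (x - mu) / np.sqrt(var + eps)   # normalized activations
    return gamma * x_hat + beta             # learnable rescale and shift

rng = np.random.default_rng(0)
x = rng.normal(loc=10.0, scale=3.0, size=(64, 4))  # shifted, scaled batch
y = batch_norm(x, gamma=np.ones(4), beta=np.zeros(4))
# per-feature mean of y is ~0 and std is ~1 after normalization
```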

Transformers - 2017

Attention is All You Need. Introduced attention mechanisms that revolutionized NLP and CV.
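The central operation, scaled dot-product attention, is softmax(QK^T / sqrt(d_k)) V: each query takes a weighted average of the values, weighted by its similarity to each key. A sketch with illustrative shapes:

```python
import numpy as np

def attention(Q, K, V):
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)   # pairwise query-key similarity
    scores -= scores.max(axis=-1, keepdims=True)   # numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)  # softmax over keys
    return weights @ V                # weighted sum of values

rng = np.random.default_rng(0)
Q = rng.normal(size=(3, 4))   # 3 queries of dimension 4
K = rng.normal(size=(5, 4))   # 5 keys
V = rng.normal(size=(5, 4))   # 5 values
out = attention(Q, K, V)
print(out.shape)  # (3, 4): one attended vector per query
```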

2017 - AdamW

Loshchilov & Hutter decoupled weight decay regularization from the adaptive gradient update, improving optimization in deep networks.

EfficientNet - 2019

Scaled models systematically with compound scaling, achieving state-of-the-art results.

2019 - RAdam

Rectified Adam to stabilize training during warm-up phases.

Vision Transformers - 2021

Used transformers for image processing, achieving remarkable results without CNNs.

2023 - Lion Optimizer

Discovered via symbolic program search by Chen et al., combining momentum with sign-based updates.
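A sketch of the Lion update: take the sign of an interpolation between the momentum and the current gradient, apply decoupled weight decay, then update the momentum separately. The function name and toy objective are illustrative:

```python
import numpy as np

def lion_step(w, grad, m, lr=1e-4, b1=0.9, b2=0.99, wd=0.0):
    update = np.sign(b1 * m + (1 - b1) * grad)  # sign-based direction
    w = w * (1 - lr * wd) - lr * update         # decoupled weight decay
    m = b2 * m + (1 - b2) * grad                # momentum updated with b2
    return w, m

# toy usage: minimize f(w) = w^2, whose gradient is 2w
w, m = 5.0, 0.0
for _ in range(2000):
    grad = 2 * w
    w, m = lion_step(w, grad, m, lr=0.01)
# w is now close to the minimum at 0
```

Because the update is a pure sign, every parameter moves by exactly `lr` per step, which is why Lion typically uses a smaller learning rate than Adam.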