History of AI

Rule-based to Statistical (1950s-2000s)

The Evolution of AI

Rule-based AI (Symbolic AI and Expert Systems, 1950s-70s)

Meaning: AI built from hand-coded rules like "if this happens, then do that."
Detailed Example:
In the 1970s, Stanford researchers built MYCIN, an expert system that recommended antibiotics for bacterial infections. Its reasoning was encoded as hand-written if-then rules, simplified here:
  • Rule: If a patient has a fever AND a cough → suggest a chest X-ray.
  • Rule: If infection type = bacterial → recommend penicillin.
Why it failed: Every new disease or situation required experts to hand-write new rules (the "knowledge acquisition bottleneck") → not scalable. A toy version of such a rule engine is sketched below.
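To make the if-then style concrete, here is a minimal sketch of a hand-coded rule engine. The rules, patient fields, and advise helper are illustrative assumptions, not MYCIN's actual rule language:

```python
# A toy rule-based "expert system": each rule pairs a condition with advice.
# These rules are simplified illustrations, not MYCIN's real knowledge base.
rules = [
    (lambda p: p["fever"] and p["cough"], "suggest a chest X-ray"),
    (lambda p: p["infection_type"] == "bacterial", "recommend penicillin"),
]

def advise(patient):
    """Fire every rule whose condition matches the patient's facts."""
    return [action for condition, action in rules if condition(patient)]

patient = {"fever": True, "cough": True, "infection_type": "bacterial"}
print(advise(patient))  # ['suggest a chest X-ray', 'recommend penicillin']
# The bottleneck: covering a new disease means hand-writing more rules.
```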

Statistical AI (1980s-90s)

Meaning: Instead of only rules, AI started using statistics to find patterns in data.
Detailed Example - Spam filters:
  • Look at all the words in an email.
  • If words like "FREE $$" or "WINNER" appear far more often than they do in legitimate mail → mark the email as spam.
  • The filter doesn't "understand" spam; it calculates probabilities from word frequencies learned from past emails.
Why it mattered: machines could now learn from examples instead of relying only on hand-written rules (a minimal sketch follows below).
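To make the word-frequency idea concrete, here is a minimal naive Bayes-style spam score, the classic technique behind 1990s spam filters. The word counts, corpus sizes, and smoothing are made-up assumptions for illustration, not data from any real filter:

```python
import math

# Illustrative (made-up) word counts from a hypothetical training set
# of 100 spam and 100 legitimate ("ham") emails.
spam_word_counts = {"free": 60, "winner": 40, "meeting": 5}
ham_word_counts = {"free": 5, "winner": 2, "meeting": 50}
num_spam, num_ham = 100, 100

def spam_probability(words):
    """Naive Bayes: combine per-word likelihoods, with +1 smoothing."""
    log_spam = math.log(num_spam / (num_spam + num_ham))  # prior P(spam)
    log_ham = math.log(num_ham / (num_spam + num_ham))    # prior P(ham)
    for w in words:
        # +1 smoothing keeps unseen words from zeroing out the probability.
        log_spam += math.log((spam_word_counts.get(w, 0) + 1) / (num_spam + 2))
        log_ham += math.log((ham_word_counts.get(w, 0) + 1) / (num_ham + 2))
    # Convert the log-odds back into a probability.
    return 1 / (1 + math.exp(log_ham - log_spam))

print(spam_probability(["free", "winner"]))  # ~0.99 -> very likely spam
print(spam_probability(["meeting"]))         # ~0.11 -> very likely ham
```

Note that the filter never "understands" anything: change the counts and the verdict changes, which is exactly what learning from examples means here.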

Data + Compute Era (2000s)

Meaning: Internet boom = tons of data + faster computers → better AI models.
Detailed Example - Google Search:
  • Old search (1990s): matched exact keywords only.
  • New ML-based search (2000s): learns from user behavior that a query like "best pizza near me" should return nearby restaurants, not just pages containing the words "best" and "pizza" (see the sketch below).
Impact: AI became useful at internet scale (search, ads, recommendations).
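The contrast between literal keyword matching and learned ranking can be sketched in a few lines. The documents, features, and weights below are toy assumptions, not Google's actual ranking system:

```python
# Toy contrast: exact keyword matching vs. a learned relevance score.
# Documents, features, and weights are illustrative assumptions only.
docs = [
    {"title": "Best pizza blog post", "is_restaurant": 0, "near_user": 0},
    {"title": "Tony's Pizzeria", "is_restaurant": 1, "near_user": 1},
]

def keyword_search(query, docs):
    """1990s-style: return docs whose title contains every query word."""
    words = query.lower().split()
    return [d for d in docs if all(w in d["title"].lower() for w in words)]

def learned_rank(query, docs):
    """2000s-style: score docs with weights (pretend they were learned
    from click data; here they are simply made up)."""
    weights = {"title_match": 1.0, "is_restaurant": 2.0, "near_user": 2.0}
    def score(d):
        title_match = sum(w in d["title"].lower() for w in query.lower().split())
        return (weights["title_match"] * title_match
                + weights["is_restaurant"] * d["is_restaurant"]
                + weights["near_user"] * d["near_user"])
    return sorted(docs, key=score, reverse=True)

print(keyword_search("best pizza", docs))            # only the blog post matches
print(learned_rank("best pizza", docs)[0]["title"])  # "Tony's Pizzeria" ranks first
```

Once features like location and page type enter the score, the relevant restaurant outranks the page that merely repeats the query words.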

The Beginning of an AI Journey

What you've learned here is the foundation of AI - how we moved from rigid rules to statistical learning.

This set the stage for the machine learning revolution that would follow.