PM & AI Chronicles

From Product Thinking to Prompt Engineering – One Tool at a Time

How AI Works 🤖🛠️: The Basics of Training, Learning, and Predicting 📚🧠📈

This article is part of the Cloud Computing & AI Foundations series, where we break down the core technologies shaping today’s digital world. For the full overview of how virtualization, cloud platforms, and intelligent systems work together, refer to the main article in this series. 👉 Cloud Computing & AI

At the most basic level, a computer is a collection of hardware components. A programmer writes an application that tells the hardware exactly what to do.

This follows the classic Input → Processing → Output model.

  • Input: The computer receives information
  • Processing: It follows the rules the programmer wrote
  • Output: It produces a result

Of course, in the real world, programs can become extremely complex. There may be multiple inputs, several possible paths, and even situations where outputs become inputs into another process. Still, the key idea remains:

  • A human developer writes the algorithm, step by step, and the computer follows it.
  • The algorithm is fixed: it has a clear start, middle, and end.

An algorithm is a set of instructions for solving a problem or completing a task. It does exactly what it is programmed to do – no more, no less.
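As a minimal sketch of this idea (the rule, names, and threshold are invented for illustration), a traditional program is just a fixed rule applied to its input:

```python
def label_temperature(temperature_celsius):
    """A fixed algorithm: it applies the same rule every time it runs."""
    if temperature_celsius > 30:  # Processing: the rule the programmer wrote
        return "hot"              # Output
    return "not hot"

print(label_temperature(35))  # Input: 35
```

The rule never changes on its own; to behave differently, the code itself must be edited.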

Let’s look at a situation where a program is written to do a specific job, but accuracy can only be determined manually.

⭐ Example: A Program That Counts Defective Products on an Assembly Line

Suppose a factory installs a camera system that takes photos of products moving down a belt. A simple program is written to:

  • Detect cracks
  • Detect scratches
  • Count how many items are defective

The logic might be something strict like:

If scratch_length > 2 cm → mark as defective
If crack_detected = true → mark as defective
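The two rules above can be sketched as straightforward Python; the measurement field names are assumptions for illustration:

```python
def is_defective(item):
    """Fixed inspection rules: flag an item if either rule fires."""
    if item["scratch_length_cm"] > 2:
        return True
    if item["crack_detected"]:
        return True
    return False

# Three hypothetical items photographed on the belt
items = [
    {"scratch_length_cm": 0.5, "crack_detected": False},
    {"scratch_length_cm": 3.0, "crack_detected": False},
    {"scratch_length_cm": 1.0, "crack_detected": True},
]

defective_count = sum(is_defective(item) for item in items)
print(f"There were {defective_count} defective items.")
```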

After processing 10,000 items, the program reports: “There were 347 defective items.”

But how do we know that number is correct? A supervisor must manually:

  • Review sample images
  • Count how many defective products they see
  • Compare it with the program’s count

If the supervisor finds 312 defects instead of 347, the program is inaccurate.
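The supervisor’s check reduces to a simple comparison of the two counts from the example:

```python
program_count = 347     # what the program reported
supervisor_count = 312  # what the supervisor counted by hand

# A positive difference means the program flagged items that were actually fine
difference = program_count - supervisor_count
print(f"The program over-counted by {difference} items.")
```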

The developer must:

  • Change the code
  • Adjust the detection rules
  • Re-run the entire program
  • Re-check the accuracy

This cycle is slow and tedious.

AI changes the approach entirely. A developer still programs the initial algorithm, but now the software can:

  • Learn from past mistakes
  • Update itself
  • Improve accuracy without rewriting the whole program

This is what makes AI “intelligent.” Its simplified model now looks like: Input → Processing → Outcome → Assessment → Adjustment

Unlike traditional software, AI has two new capabilities:

  • Assessment: The AI evaluates its own performance: β€œWas my prediction right? How close was I to the correct answer?”
  • Adjustment: Based on feedback, the AI adjusts its internal parameters to improve next time.

This is the core of how AI systems learn.
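A minimal sketch of this assess-and-adjust loop, using a single detection threshold as the “internal parameter” (the labeled data and all numbers are invented for illustration):

```python
# Hypothetical human-labeled feedback: (scratch length in cm, truly defective?)
labeled_data = [(0.5, False), (1.2, False), (1.8, True), (2.5, True), (3.1, True)]

threshold = 3.0  # the developer's initial rule: scratch > 3.0 cm means defective

for cycle in range(50):
    # Assessment: compare the current rule's predictions with the human labels
    misses = sum(truth and not (length > threshold)
                 for length, truth in labeled_data)        # real defects it missed
    false_alarms = sum((length > threshold) and not truth
                       for length, truth in labeled_data)  # good items it flagged
    if misses == 0 and false_alarms == 0:
        break  # the rule now agrees with the feedback
    # Adjustment: nudge the parameter in whichever direction reduces mistakes
    threshold += -0.1 if misses > false_alarms else 0.1

print(f"Learned threshold: {threshold:.1f} cm")
```

Real systems work the same way, except they tune millions of parameters at once instead of a single threshold.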

Let’s return to our factory example. If the AI system miscounts defects, humans provide feedback:

  • Show it the images where it got the count wrong
  • Highlight which items were truly defective
  • Correct its mistakes

The AI then:

  • Learns patterns in cracks and scratches
  • Refines its detection rules automatically
  • Reduces errors with each cycle

This assessment-and-adjustment loop is known as training the model.

After enough rounds, the AI reaches a high level of accuracy and is ready to evaluate new, unseen data with confidence.

Cloud computing has played a huge role in making AI accessible. AI models often require:

  • Massive computing power
  • Huge storage for datasets
  • Fast processing of complex algorithms

Buying this hardware yourself would be extremely expensive. The cloud changes that by offering:

  • Scalable computing power (only pay for what you use)
  • Storage for large datasets
  • On-demand GPUs/TPUs for faster training
  • Tools and platforms for machine learning

Because of this, even students, startups, and small companies can experiment with AI without owning powerful computers.

Understanding how AI works becomes much easier when you compare it to the way computers have traditionally operated. In the past, a program could only follow the exact rules a developer wrote for it. But with AI, software can now evaluate its own performance, learn from feedback, and steadily improve without rewriting the code each time.

This shift from fixed logic to self-adjusting systems is what makes AI so powerful. And thanks to cloud computing, the enormous processing and storage needs of AI are now accessible to anyone with an internet connection. Together, these two technologies have opened the door for smarter applications, faster innovation, and new possibilities in every industry.

In the next article, we’ll explore the different disciplines of AI, how they are classified, and what each type is capable of in the real world. 👉 Exploring Intelligence