
Diego Petrecolla · November 26, 2025 · 3 min read

Artificial Neural Networks: Biological Inspiration Behind Deep Learning

Deep Learning

Artificial Neural Networks (ANNs) are the foundation of modern Artificial Intelligence.
In this guide, you’ll learn what neural networks are, how they work, and why they power today’s most advanced AI systems, from image recognition to large language models built on the Transformer architecture.

Whether you’re a student, developer, or simply curious about AI, this article will give you a clear and practical understanding of how neural networks learn patterns from data.


What Is an Artificial Neural Network (ANN)?

An Artificial Neural Network is a computational system inspired by the structure of the human brain.
While it doesn’t think like a biological brain, it learns to perform complex tasks by analyzing large datasets and identifying patterns automatically.

Neural networks are widely used in:

  • Computer vision
  • Natural language processing
  • Speech recognition
  • Autonomous systems
  • Fraud detection

At a high level, an ANN is a mathematical function composed of interconnected layers of artificial neurons.


How a Neural Network Works (Simple Breakdown)

Each artificial neuron performs four basic steps:

  1. Receives inputs
  2. Multiplies them by weights
  3. Adds a bias
  4. Passes the result through an activation function

This allows a neural network to approximate extremely complex relationships between inputs and outputs.
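In code, those four steps take only a few lines. Here is a minimal sketch of a single neuron in NumPy (the function name and example values are illustrative, not from any library):

import numpy as np

def neuron(inputs, weights, bias):
    # Steps 1-3: receive inputs, multiply by weights, add a bias
    z = np.dot(inputs, weights) + bias
    # Step 4: pass the result through an activation function (sigmoid here)
    return 1 / (1 + np.exp(-z))

# A neuron with two inputs
x = np.array([0.5, -1.0])
w = np.array([0.8, 0.2])
b = 0.1
print(neuron(x, w, b))  # a single output between 0 and 1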


Core Components of a Neural Network

1. Neurons (Nodes)

The fundamental processing units. Each neuron takes inputs, performs a calculation, and produces an output.

2. Weights and Biases

Weights determine the importance of each input.
Biases shift the activation threshold.

Together, they are the model’s learnable parameters.

3. Activation Functions

These functions allow the network to learn nonlinear patterns, essential for tasks like image classification, language modeling, or sound recognition.

Popular activation functions include:

  • ReLU
  • Sigmoid
  • Tanh
  • Softmax
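As a rough sketch, all four can be written in a few lines of NumPy (the softmax subtracts the maximum before exponentiating, a standard trick for numerical stability):

import numpy as np

def relu(z):
    return np.maximum(0, z)       # zero for negatives, identity for positives

def sigmoid(z):
    return 1 / (1 + np.exp(-z))   # squashes values into (0, 1)

def tanh(z):
    return np.tanh(z)             # squashes values into (-1, 1)

def softmax(z):
    e = np.exp(z - np.max(z))     # subtract the max for numerical stability
    return e / e.sum()            # outputs sum to 1, like a probability distribution

z = np.array([-2.0, 0.0, 3.0])
print(relu(z), sigmoid(z), tanh(z), softmax(z), sep="\n")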

Neural Network Architecture: Input, Hidden, Output Layers

A standard ANN consists of:

Input Layer

Where raw data enters the model—for example, pixels from an image or features from a dataset.

Hidden Layers

These layers extract patterns, features, and internal representations from the data.

When a model has two or more hidden layers, it becomes a Deep Neural Network, forming the basis of Deep Learning.

Output Layer

Produces the final prediction—for example, class labels like “cat” or “dog.”
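To make the three layers concrete, here is a minimal sketch (the layer sizes 4, 3, and 2 are arbitrary choices for illustration) of one sample flowing input → hidden → output:

import numpy as np

rng = np.random.default_rng(0)

x = rng.random((1, 4))              # input layer: 1 sample with 4 features

W1 = rng.standard_normal((4, 3))    # hidden layer: 4 inputs -> 3 neurons
b1 = np.zeros(3)
h = np.tanh(x @ W1 + b1)            # internal representation, shape (1, 3)

W2 = rng.standard_normal((3, 2))    # output layer: 3 inputs -> 2 classes
b2 = np.zeros(2)
scores = h @ W2 + b2                # final prediction scores, shape (1, 2)

print(h.shape, scores.shape)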


How Neural Networks Learn: Forward Propagation + Backpropagation

The learning process involves two key phases:

1. Forward Propagation

  • Data flows from input → hidden layers → output
  • The network produces a prediction

2. Backpropagation

  • A loss function measures how wrong the prediction is
  • The error flows backward through the network
  • Gradient Descent updates weights and biases to reduce error

This cycle repeats for thousands or millions of iterations, gradually driving the loss down until the model’s predictions are accurate enough.
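Stripped to its essentials, the update rule is simple: move each parameter a small step against its gradient. Here is a toy sketch on a single parameter (the function f(w) = (w - 3)² is invented purely for illustration):

# Minimize f(w) = (w - 3)**2; its gradient is 2 * (w - 3)
w, lr = 0.0, 0.1
for step in range(100):
    grad = 2 * (w - 3)   # direction and size of the error
    w -= lr * grad       # step against the gradient to reduce the error
print(w)                 # converges toward the minimum at w = 3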


Practical Example: Building a Neural Network From Scratch (Python + NumPy)

This simple example solves the classic XOR problem, a task that linear models cannot handle and that shows why hidden layers matter.

import numpy as np

# XOR dataset
X = np.array([[0,0],[0,1],[1,0],[1,1]])
y = np.array([[0],[1],[1],[0]])

# Weight initialization
np.random.seed(42)
W1 = np.random.randn(2, 2)
b1 = np.zeros((1, 2))
W2 = np.random.randn(2, 1)
b2 = np.zeros((1, 1))

def sigmoid(z):
    return 1 / (1 + np.exp(-z))

lr = 0.1
for epoch in range(10000):
    # Forward pass
    z1 = X.dot(W1) + b1
    a1 = sigmoid(z1)
    z2 = a1.dot(W2) + b2
    a2 = sigmoid(z2)

    # Backpropagation (the sigmoid derivative is a * (1 - a))
    error = y - a2
    d2 = error * a2 * (1 - a2)            # delta at the output layer
    d1 = d2.dot(W2.T) * a1 * (1 - a1)     # delta propagated back to the hidden layer

    # Parameter updates (gradient descent; the minus sign is folded into `error`)
    W2 += a1.T.dot(d2) * lr
    b2 += np.sum(d2, axis=0, keepdims=True) * lr
    W1 += X.T.dot(d1) * lr
    b1 += np.sum(d1, axis=0, keepdims=True) * lr

print("Final predictions:")
print(a2.round())

This demonstrates the core mechanics behind neural networks: forward passes, backpropagation, and gradient-based learning.


Modern Example: Neural Network in PyTorch

A more practical and scalable implementation using PyTorch:

import torch
import torch.nn as nn
import torch.optim as optim

# XOR data
X = torch.tensor([[0,0],[0,1],[1,0],[1,1]], dtype=torch.float32)
y = torch.tensor([[0],[1],[1],[0]], dtype=torch.float32)

class XORNet(nn.Module):
    def __init__(self):
        super(XORNet, self).__init__()
        self.hidden = nn.Linear(2, 2)
        self.output = nn.Linear(2, 1)
        self.sigmoid = nn.Sigmoid()
    
    def forward(self, x):
        x = self.sigmoid(self.hidden(x))
        x = self.sigmoid(self.output(x))
        return x

model = XORNet()
criterion = nn.MSELoss()
optimizer = optim.SGD(model.parameters(), lr=0.1)

for epoch in range(10000):
    optimizer.zero_grad()          # reset gradients from the previous step
    outputs = model(X)             # forward pass
    loss = criterion(outputs, y)   # measure how wrong the prediction is
    loss.backward()                # backpropagate the error
    optimizer.step()               # update weights and biases

print("Final predictions:")
print(model(X).round().detach())
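One design note: since the targets here are binary, nn.BCELoss paired with the sigmoid output is the more conventional choice than MSELoss, though MSELoss works for this toy problem and mirrors the NumPy example above.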

Real-World Applications of Neural Networks

Neural networks drive innovation across major industries:

Healthcare

  • Medical imaging diagnostics
  • Early detection of tumors, diabetic retinopathy, and rare diseases

Finance

  • Real-time fraud detection
  • Predictive risk modeling

Technology & Consumer Apps

  • Recommendation systems (Netflix, Amazon)
  • Virtual assistants (Siri, Alexa)
  • Automatic translation (Google Translate)

Robotics & Autonomous Systems

  • Industrial robot control
  • Decision-making in self-driving vehicles

Neural Networks Are Not Magic—They’re Mathematics

Neural networks allow machines to learn from data with unprecedented depth, forming the foundation of modern AI.
Understanding how they work means understanding the core engine behind today’s intelligent technologies.

The future of AI is here—fully connected, data-driven, and powered by neural networks.

