{"id":35314,"date":"2026-02-18T20:17:01","date_gmt":"2026-02-18T19:17:01","guid":{"rendered":"https:\/\/www.codemotion.com\/magazine\/?p=35314"},"modified":"2026-02-18T20:17:03","modified_gmt":"2026-02-18T19:17:03","slug":"the-perceptron-the-first-building-block-of-neural-networks","status":"publish","type":"post","link":"https:\/\/www.codemotion.com\/magazine\/ai-ml\/deep-learning\/the-perceptron-the-first-building-block-of-neural-networks\/","title":{"rendered":"The Perceptron: The First Building Block of Neural Networks"},"content":{"rendered":"\n<hr class=\"wp-block-separator has-alpha-channel-opacity\"\/>\n\n\n\n<p>Imagine a miniature brain\u2014a digital cell that can learn to make simple decisions. That&#8217;s essentially what a perceptron is: the forgotten hero that laid the foundation for the powerful neural networks driving today&#8217;s AI. Don&#8217;t worry about the jargon; we&#8217;ll break it down piece by piece until it clicks.<\/p>\n\n\n\n<h2 class=\"wp-block-heading\" id=\"h-a-real-machine-that-changed-everything\">A Real Machine That Changed Everything<\/h2>\n\n\n\n<p>The perceptron isn&#8217;t just a theoretical concept\u2014it&#8217;s a piece of computing history. In <strong>1958<\/strong>, psychologist <strong>Frank Rosenblatt<\/strong> at Cornell Aeronautical Laboratory built the <strong>Mark I Perceptron<\/strong>, an actual physical machine that could learn to recognize simple patterns in 20\u00d720 pixel images using photocells, potentiometers, and electric motors to implement the algorithm in hardware.<\/p>\n\n\n\n<p>Rosenblatt was bold in his claims, predicting machines that would &#8220;walk, talk, see, write, reproduce itself and be conscious of its existence.&#8221; The hype was real\u2014until 1969, when Marvin Minsky and Seymour Papert published <em>Perceptrons<\/em>, mathematically proving that a single-layer perceptron <strong>cannot solve the XOR problem<\/strong>\u2014a simple logical operation. 
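To see that limitation concretely, here is a brute-force sketch in plain Python (no libraries, illustrative weight grid): a single step-threshold neuron can reproduce the AND truth table for some choice of weights and bias, but no combination reproduces XOR.

```python
from itertools import product

def neuron(x1, x2, w1, w2, b):
    # Step-threshold unit: fires (1) when the weighted sum is non-negative
    return 1 if x1 * w1 + x2 * w2 + b >= 0 else 0

def separable(truth_table):
    # Try a coarse grid of weights and biases; return True if any single
    # neuron reproduces the whole truth table.
    grid = [v / 2 for v in range(-8, 9)]  # -4.0 .. 4.0 in steps of 0.5
    for w1, w2, b in product(grid, grid, grid):
        if all(neuron(x1, x2, w1, w2, b) == y
               for (x1, x2), y in truth_table.items()):
            return True
    return False

AND = {(0, 0): 0, (0, 1): 0, (1, 0): 0, (1, 1): 1}
XOR = {(0, 0): 0, (0, 1): 1, (1, 0): 1, (1, 1): 0}

print(separable(AND))  # True: AND is linearly separable
print(separable(XOR))  # False: no single neuron reproduces XOR
```

The failure is not an artifact of the grid: the four XOR constraints (b < 0, w1 + b ≥ 0, w2 + b ≥ 0, w1 + w2 + b < 0) are mutually inconsistent, which is what Minsky and Papert proved in general.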
This limitation killed neural network funding for nearly two decades, triggering the first &#8220;AI winter.&#8221;<\/p>\n\n\n\n<p>The story has a happy ending: in the 1980s, <strong>multi-layer perceptrons (MLPs)<\/strong> with backpropagation solved the XOR limitation, reviving the field and leading directly to modern deep learning.<\/p>\n\n\n\n<h2 class=\"wp-block-heading\" id=\"h-from-biological-neurons-to-artificial-ones\">From Biological Neurons to Artificial Ones<\/h2>\n\n\n\n<p>Think about your brain. It&#8217;s made up of billions of interconnected neurons communicating with each other. When you see a dog, specific neurons fire in a particular pattern, allowing you to recognize it. Artificial neural networks simplify this biological process. They&#8217;re computational systems inspired by the brain that can learn from data, identify patterns, and make decisions.<\/p>\n\n\n\n<p><strong>The perceptron is the simplest neuron in these networks.<\/strong> It&#8217;s the fundamental unit that receives information, processes it, and outputs a signal. It&#8217;s an artificial neuron that takes multiple inputs, weights them, and produces a binary output (0 or 1).<\/p>\n\n\n\n<p>Before diving into the perceptron itself, let&#8217;s cover the mathematical tools that make it work: vectors, matrices, and the dot product.<\/p>\n\n\n\n<h2 class=\"wp-block-heading\" id=\"h-understanding-vectors\">Understanding Vectors<\/h2>\n\n\n\n<p>A vector is simply an ordered list of numbers. 
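In code, "ordered list of numbers" can be taken literally. A minimal Python sketch (a plain list here; in practice NumPy arrays are the usual choice):

```python
# A vector is an ordered list of numbers: the position of each
# component matters as much as its value.
v = [2, -1, 4]

print(len(v))  # 3 components
print(v[0])    # 2, the first component

# Same numbers in a different order are a different vector.
assert v != [4, -1, 2]
```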
Think of it as an arrow in space pointing to a specific position.<\/p>\n\n\n\n<p><strong>Example:<\/strong> <code>v = [2, -1, 4]<\/code><\/p>\n\n\n\n<p>In our perceptron:<\/p>\n\n\n\n<ul class=\"wp-block-list\">\n<li>Our inputs form a vector: <code>X = [x\u2081, x\u2082, x\u2083]<\/code><\/li>\n\n\n\n<li>Our weights form another vector: <code>W = [w\u2081, w\u2082, w\u2083]<\/code><\/li>\n<\/ul>\n\n\n\n<h2 class=\"wp-block-heading\" id=\"h-perceptron-components-inputs-weights-and-output\">Perceptron Components: Inputs, Weights, and Output<\/h2>\n\n\n\n<p>Imagine the perceptron as an intelligent scale.<\/p>\n\n\n\n<h3 class=\"wp-block-heading\" id=\"h-inputs\">Inputs<\/h3>\n\n\n\n<p>These are the data we feed the perceptron. Think of them as features or &#8220;clues&#8221; about something. For example, if we want our perceptron to decide whether a fruit is an apple, the inputs might be:<\/p>\n\n\n\n<ul class=\"wp-block-list\">\n<li>Shape (is it round?)<\/li>\n\n\n\n<li>Color (is it red or green?)<\/li>\n\n\n\n<li>Size (small\/medium?)<\/li>\n<\/ul>\n\n\n\n<p>Each input is a number. We represent them as: <code>[x\u2081, x\u2082, x\u2083, ..., x\u2099]<\/code><\/p>\n\n\n\n<h3 class=\"wp-block-heading\" id=\"h-weights\">Weights<\/h3>\n\n\n\n<p>This is where the perceptron&#8217;s &#8220;intelligence&#8221; begins to emerge. Each input has an associated weight. Think of these as the &#8220;importance&#8221; the perceptron assigns to each clue.<\/p>\n\n\n\n<ul class=\"wp-block-list\">\n<li>If red color is very important for identifying an apple, the weight for &#8220;red color&#8221; will be high<\/li>\n\n\n\n<li>If size isn&#8217;t as important, its weight will be lower<\/li>\n<\/ul>\n\n\n\n<p>These weights are numbers that the perceptron learns and adjusts over time. 
We represent them as: <code>[w\u2081, w\u2082, w\u2083, ..., w\u2099]<\/code><\/p>\n\n\n\n<h3 class=\"wp-block-heading\" id=\"h-bias\">Bias<\/h3>\n\n\n\n<p>Think of this as an additional &#8220;threshold&#8221; or &#8220;nudge&#8221; the perceptron can have. It helps adjust the output even when all inputs are zero. It&#8217;s like a default value that&#8217;s always there, regardless of inputs.<\/p>\n\n\n\n<h2 class=\"wp-block-heading\" id=\"h-understanding-matrices\">Understanding Matrices<\/h2>\n\n\n\n<p>A matrix is a rectangular collection of numbers organized in rows and columns\u2014essentially a data table or multiple stacked vectors.<\/p>\n\n\n\n<p><strong>Example:<\/strong><\/p>\n\n\n<pre class=\"wp-block-code\"><span><code class=\"hljs\">M = &#91;&#91;2, -1, 4],\n     &#91;1,  0, 3],\n     &#91;5,  2, 1]]\n<\/code><\/span><\/pre>\n\n\n<p>In multi-layer neural networks, we can represent all weights between one layer and the next as a matrix. This enables highly efficient computations using matrix multiplication\u2014a generalization of the dot product. Each column of the weight matrix could represent the weights for a specific perceptron in the next layer.<\/p>\n\n\n\n<p><strong>What are they used for?<\/strong><\/p>\n\n\n\n<ul class=\"wp-block-list\">\n<li>Representing input data (vectors)<\/li>\n\n\n\n<li>Weighting those inputs (weight vector)<\/li>\n\n\n\n<li>Performing operations across many neurons simultaneously (weight matrices)<\/li>\n<\/ul>\n\n\n\n<h2 class=\"wp-block-heading\" id=\"h-the-perceptron-s-internal-process\">The Perceptron&#8217;s Internal Process<\/h2>\n\n\n\n<p>Now let&#8217;s see how the perceptron makes a decision. The heart of the perceptron is a simple but powerful operation:<\/p>\n\n\n\n<h3 class=\"wp-block-heading\" id=\"h-step-1-multiply-and-sum-the-dot-product\">Step 1: Multiply and Sum (The Dot Product)<\/h3>\n\n\n\n<p>The perceptron takes each input (x\u1d62) and multiplies it by its corresponding weight (w\u1d62). 
Then it sums all these products and adds the bias (b).<\/p>\n\n\n\n<p>With two inputs (x\u2081, x\u2082), their corresponding weights (w\u2081, w\u2082), and a bias (b), the calculation would be:<\/p>\n\n\n<pre class=\"wp-block-code\"><span><code class=\"hljs\">Result = (x\u2081 \u00d7 w\u2081) + (x\u2082 \u00d7 w\u2082) + b\n<\/code><\/span><\/pre>\n\n\n<p>This step is crucial, and this is where vectors and the dot product come into play.<\/p>\n\n\n\n<p><strong>The Dot Product<\/strong> (or scalar product) between two vectors v and w of the same length is the sum of the products of their components:<\/p>\n\n\n<pre class=\"wp-block-code\"><span><code class=\"hljs\">v \u00b7 w = (v\u2081 \u00d7 w\u2081) + (v\u2082 \u00d7 w\u2082) + ... + (v\u2099 \u00d7 w\u2099)\n<\/code><\/span><\/pre>\n\n\n<p><strong>Example:<\/strong><\/p>\n\n\n<pre class=\"wp-block-code\"><span><code class=\"hljs\">v = &#91;2, -1, 4]\nw = &#91;1,  3, 0]\nv \u00b7 w = (2\u00d71) + (-1\u00d73) + (4\u00d70) = 2 - 3 + 0 = -1\n<\/code><\/span><\/pre>\n\n\n<p>In the perceptron, this value determines whether the neuron &#8220;fires&#8221; (1) or not (0).<\/p>\n\n\n\n<p>In other words, the dot product is a mathematical operation between two vectors that produces a single number. This is exactly what we do in Step 1:<\/p>\n\n\n<pre class=\"wp-block-code\"><span><code class=\"hljs\">Result = (X \u00b7 W) + b\n<\/code><\/span><\/pre>\n\n\n<h3 class=\"wp-block-heading\" id=\"h-step-2-the-decision-activation-function\">Step 2: The Decision (Activation Function)<\/h3>\n\n\n\n<p>Once the perceptron has combined its inputs with the weights and bias, the result passes through an &#8220;activation function.&#8221; This function acts like a switch. 
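Both steps translate directly into code. A minimal plain-Python sketch, reusing the dot-product example above (the two-input values and weights below are illustrative):

```python
def dot_product(v, w):
    # v · w = (v1 × w1) + (v2 × w2) + ... + (vn × wn)
    return sum(vi * wi for vi, wi in zip(v, w))

# The dot-product example from above:
assert dot_product([2, -1, 4], [1, 3, 0]) == -1

def perceptron(x, weights, b):
    # Step 1: weighted sum of the inputs, plus the bias
    z = dot_product(x, weights) + b
    # Step 2: step activation; the neuron "fires" when z is non-negative
    return 1 if z >= 0 else 0

x = [1, 2]         # two inputs (illustrative values)
weights = [3, -1]  # their weights (illustrative values)
b = 2              # bias

print(perceptron(x, weights, b))  # z = (1 × 3) + (2 × -1) + 2 = 3, so output 1
```

Changing the bias to a large negative value, say `b = -5`, drops `z` below zero and the same inputs produce 0 instead: the bias shifts the decision threshold exactly as described.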
For the original perceptron, this function was very simple:<\/p>\n\n\n\n<ul class=\"wp-block-list\">\n<li>If the result exceeds a certain threshold (e.g., zero), the perceptron &#8220;activates&#8221; and outputs a 1<\/li>\n\n\n\n<li>If it doesn&#8217;t exceed it, it outputs a 0<\/li>\n<\/ul>\n\n\n\n<p>For the original perceptron, this was a step or threshold function:<\/p>\n\n\n<pre class=\"wp-block-code\"><span><code class=\"hljs\">If result \u2265 threshold (or result \u2265 0 if threshold is incorporated into bias), output is 1\nIf result &lt; threshold, output is 0\n<\/code><\/span><\/pre>\n\n\n<p>And that&#8217;s it! The perceptron has made a decision. It&#8217;s a binary decision: yes or no, 1 or 0.<\/p>\n\n\n\n<h2 class=\"wp-block-heading\" id=\"h-the-perceptron-step-by-step\">The Perceptron Step-by-Step<\/h2>\n\n\n\n<ol class=\"wp-block-list\">\n<li><strong>Inputs:<\/strong> <code>x = [x\u2081, x\u2082, \u2026, x\u2099]<\/code><\/li>\n\n\n\n<li><strong>Weights:<\/strong> <code>w = [w\u2081, w\u2082, \u2026, w\u2099]<\/code><\/li>\n\n\n\n<li><strong>Bias:<\/strong> a number <code>b<\/code> that shifts the decision threshold<\/li>\n\n\n\n<li><strong>Internal calculation:<\/strong> <code>z = dot_product(w, x) + b<\/code><\/li>\n\n\n\n<li><strong>Activation function (step):<\/strong> <code>y = 1 if z \u2265 0, y = 0 if z &lt; 0<\/code><\/li>\n<\/ol>\n\n\n\n<p>This simple scheme learns to classify linearly separable data\u2014meaning data that can be divided by a straight line (or hyperplane in higher dimensions).<\/p>\n\n\n\n<h2 class=\"wp-block-heading\" id=\"h-why-this-matters-for-modern-developers\">Why This Matters for Modern Developers<\/h2>\n\n\n\n<p>The perceptron may seem primitive by today&#8217;s standards, but it introduced fundamental concepts that power modern deep learning:<\/p>\n\n\n\n<ul class=\"wp-block-list\">\n<li><strong>Weighted inputs:<\/strong> Still the core of every neural network<\/li>\n\n\n\n<li><strong>Learning through weight 
adjustment:<\/strong> The basis of backpropagation<\/li>\n\n\n\n<li><strong>Linear separation:<\/strong> Understanding its limitations led to multi-layer networks<\/li>\n<\/ul>\n\n\n\n<p>While a single perceptron can only solve linearly separable problems\u2014the limitation that caused the first &#8220;AI winter&#8221;\u2014stacking multiple perceptrons into layers creates the deep neural networks that now power everything from image recognition to large language models.<\/p>\n\n\n\n<p>Today, the perceptron algorithm is:<\/p>\n\n\n\n<ul class=\"wp-block-list\">\n<li>Taught in every ML course as the foundational building block<\/li>\n\n\n\n<li>Used in production as part of larger neural networks<\/li>\n\n\n\n<li>The conceptual ancestor of every modern deep learning model<\/li>\n<\/ul>\n\n\n\n<hr class=\"wp-block-separator has-alpha-channel-opacity\"\/>\n\n\n\n<p><strong>Key takeaway:<\/strong> The perceptron is where it all began. Understanding this simple unit\u2014inputs, weights, dot products, and activation functions\u2014gives you the mental model to understand modern neural architectures, from CNNs to Transformers.<\/p>\n","protected":false},"excerpt":{"rendered":"<p>Imagine a miniature brain\u2014a digital cell that can learn to make simple decisions. That&#8217;s essentially what a perceptron is: the forgotten hero that laid the foundation for the powerful neural networks driving today&#8217;s AI. Don&#8217;t worry about the jargon; we&#8217;ll break it down piece by piece until it clicks. 
A Real Machine That Changed Everything&#8230; <a class=\"more-link\" href=\"https:\/\/www.codemotion.com\/magazine\/ai-ml\/deep-learning\/the-perceptron-the-first-building-block-of-neural-networks\/\">Read more<\/a><\/p>\n","protected":false},"author":64,"featured_media":33598,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"_editorskit_title_hidden":false,"_editorskit_reading_time":0,"_editorskit_is_block_options_detached":false,"_editorskit_block_options_position":"{}","_uag_custom_page_level_css":"","_genesis_hide_title":false,"_genesis_hide_breadcrumbs":false,"_genesis_hide_singular_image":false,"_genesis_hide_footer_widgets":false,"_genesis_custom_body_class":"","_genesis_custom_post_class":"","_genesis_layout":"","footnotes":""},"categories":[47],"tags":[10003,13836,13838],"collections":[11387],"class_list":{"0":"post-35314","1":"post","2":"type-post","3":"status-publish","4":"format-standard","5":"has-post-thumbnail","7":"category-deep-learning","8":"tag-ai","9":"tag-perceptron","10":"tag-rosenblatt","11":"collections-top-of-the-week","12":"entry"},"yoast_head":"<!-- This site is optimized with the Yoast SEO Premium plugin v26.9 (Yoast SEO v26.9) - https:\/\/yoast.com\/product\/yoast-seo-premium-wordpress\/ -->\n<title>The Perceptron: The Foundation of Neural Architectures<\/title>\n<meta name=\"description\" content=\"The perceptron is where it all began. 
Understanding this simple unit\u2014inputs, weights, dot products, and activation functions\u2014gives you the mental model to understand modern neural architectures, from CNNs to Transformers.\" \/>\n<meta name=\"robots\" content=\"index, follow, max-snippet:-1, max-image-preview:large, max-video-preview:-1\" \/>\n<link rel=\"canonical\" href=\"https:\/\/www.codemotion.com\/magazine\/ai-ml\/deep-learning\/the-perceptron-the-first-building-block-of-neural-networks\/\" \/>\n<meta property=\"og:locale\" content=\"en_US\" \/>\n<meta property=\"og:type\" content=\"article\" \/>\n<meta property=\"og:title\" content=\"The Perceptron: The First Building Block of Neural Networks\" \/>\n<meta property=\"og:description\" content=\"The perceptron is where it all began. Understanding this simple unit\u2014inputs, weights, dot products, and activation functions\u2014gives you the mental model to understand modern neural architectures, from CNNs to Transformers.\" \/>\n<meta property=\"og:url\" content=\"https:\/\/www.codemotion.com\/magazine\/ai-ml\/deep-learning\/the-perceptron-the-first-building-block-of-neural-networks\/\" \/>\n<meta property=\"og:site_name\" content=\"Codemotion Magazine\" \/>\n<meta property=\"article:publisher\" content=\"https:\/\/www.facebook.com\/Codemotion.Italy\/\" \/>\n<meta property=\"article:published_time\" content=\"2026-02-18T19:17:01+00:00\" \/>\n<meta property=\"article:modified_time\" content=\"2026-02-18T19:17:03+00:00\" \/>\n<meta property=\"og:image\" content=\"https:\/\/www.codemotion.com\/magazine\/wp-content\/uploads\/2025\/07\/redes-neuronales.jpg\" \/>\n\t<meta property=\"og:image:width\" content=\"1280\" \/>\n\t<meta property=\"og:image:height\" content=\"720\" \/>\n\t<meta property=\"og:image:type\" content=\"image\/jpeg\" \/>\n<meta name=\"author\" content=\"Codemotion\" \/>\n<meta name=\"twitter:card\" content=\"summary_large_image\" \/>\n<meta name=\"twitter:creator\" content=\"@CodemotionIT\" \/>\n<meta name=\"twitter:site\" 
content=\"@CodemotionIT\" \/>\n<meta name=\"twitter:label1\" content=\"Written by\" \/>\n\t<meta name=\"twitter:data1\" content=\"Codemotion\" \/>\n\t<meta name=\"twitter:label2\" content=\"Est. reading time\" \/>\n\t<meta name=\"twitter:data2\" content=\"5 minutes\" \/>\n<script type=\"application\/ld+json\" class=\"yoast-schema-graph\">{\"@context\":\"https:\/\/schema.org\",\"@graph\":[{\"@type\":\"Article\",\"@id\":\"https:\/\/www.codemotion.com\/magazine\/ai-ml\/deep-learning\/the-perceptron-the-first-building-block-of-neural-networks\/#article\",\"isPartOf\":{\"@id\":\"https:\/\/www.codemotion.com\/magazine\/ai-ml\/deep-learning\/the-perceptron-the-first-building-block-of-neural-networks\/\"},\"author\":{\"name\":\"Codemotion\",\"@id\":\"https:\/\/www.codemotion.com\/magazine\/#\/schema\/person\/201bb98b02412383686cced7521b861c\"},\"headline\":\"The Perceptron: The First Building Block of Neural Networks\",\"datePublished\":\"2026-02-18T19:17:01+00:00\",\"dateModified\":\"2026-02-18T19:17:03+00:00\",\"mainEntityOfPage\":{\"@id\":\"https:\/\/www.codemotion.com\/magazine\/ai-ml\/deep-learning\/the-perceptron-the-first-building-block-of-neural-networks\/\"},\"wordCount\":1089,\"commentCount\":0,\"publisher\":{\"@id\":\"https:\/\/www.codemotion.com\/magazine\/#organization\"},\"image\":{\"@id\":\"https:\/\/www.codemotion.com\/magazine\/ai-ml\/deep-learning\/the-perceptron-the-first-building-block-of-neural-networks\/#primaryimage\"},\"thumbnailUrl\":\"https:\/\/www.codemotion.com\/magazine\/wp-content\/uploads\/2025\/07\/redes-neuronales.jpg\",\"keywords\":[\"AI\",\"perceptron\",\"rosenblatt\"],\"articleSection\":[\"Deep 
Learning\"],\"inLanguage\":\"en-US\",\"potentialAction\":[{\"@type\":\"CommentAction\",\"name\":\"Comment\",\"target\":[\"https:\/\/www.codemotion.com\/magazine\/ai-ml\/deep-learning\/the-perceptron-the-first-building-block-of-neural-networks\/#respond\"]}]},{\"@type\":\"WebPage\",\"@id\":\"https:\/\/www.codemotion.com\/magazine\/ai-ml\/deep-learning\/the-perceptron-the-first-building-block-of-neural-networks\/\",\"url\":\"https:\/\/www.codemotion.com\/magazine\/ai-ml\/deep-learning\/the-perceptron-the-first-building-block-of-neural-networks\/\",\"name\":\"The Perceptron: The Foundation of Neural Architectures\",\"isPartOf\":{\"@id\":\"https:\/\/www.codemotion.com\/magazine\/#website\"},\"primaryImageOfPage\":{\"@id\":\"https:\/\/www.codemotion.com\/magazine\/ai-ml\/deep-learning\/the-perceptron-the-first-building-block-of-neural-networks\/#primaryimage\"},\"image\":{\"@id\":\"https:\/\/www.codemotion.com\/magazine\/ai-ml\/deep-learning\/the-perceptron-the-first-building-block-of-neural-networks\/#primaryimage\"},\"thumbnailUrl\":\"https:\/\/www.codemotion.com\/magazine\/wp-content\/uploads\/2025\/07\/redes-neuronales.jpg\",\"datePublished\":\"2026-02-18T19:17:01+00:00\",\"dateModified\":\"2026-02-18T19:17:03+00:00\",\"description\":\"The perceptron is where it all began. 
Understanding this simple unit\u2014inputs, weights, dot products, and activation functions\u2014gives you the mental model to understand modern neural architectures, from CNNs to Transformers.\",\"breadcrumb\":{\"@id\":\"https:\/\/www.codemotion.com\/magazine\/ai-ml\/deep-learning\/the-perceptron-the-first-building-block-of-neural-networks\/#breadcrumb\"},\"inLanguage\":\"en-US\",\"potentialAction\":[{\"@type\":\"ReadAction\",\"target\":[\"https:\/\/www.codemotion.com\/magazine\/ai-ml\/deep-learning\/the-perceptron-the-first-building-block-of-neural-networks\/\"]}]},{\"@type\":\"ImageObject\",\"inLanguage\":\"en-US\",\"@id\":\"https:\/\/www.codemotion.com\/magazine\/ai-ml\/deep-learning\/the-perceptron-the-first-building-block-of-neural-networks\/#primaryimage\",\"url\":\"https:\/\/www.codemotion.com\/magazine\/wp-content\/uploads\/2025\/07\/redes-neuronales.jpg\",\"contentUrl\":\"https:\/\/www.codemotion.com\/magazine\/wp-content\/uploads\/2025\/07\/redes-neuronales.jpg\",\"width\":1280,\"height\":720},{\"@type\":\"BreadcrumbList\",\"@id\":\"https:\/\/www.codemotion.com\/magazine\/ai-ml\/deep-learning\/the-perceptron-the-first-building-block-of-neural-networks\/#breadcrumb\",\"itemListElement\":[{\"@type\":\"ListItem\",\"position\":1,\"name\":\"Home\",\"item\":\"https:\/\/www.codemotion.com\/magazine\/\"},{\"@type\":\"ListItem\",\"position\":2,\"name\":\"AI\/ML\",\"item\":\"https:\/\/www.codemotion.com\/magazine\/ai-ml\/\"},{\"@type\":\"ListItem\",\"position\":3,\"name\":\"Deep Learning\",\"item\":\"https:\/\/www.codemotion.com\/magazine\/ai-ml\/deep-learning\/\"},{\"@type\":\"ListItem\",\"position\":4,\"name\":\"The Perceptron: The First Building Block of Neural Networks\"}]},{\"@type\":\"WebSite\",\"@id\":\"https:\/\/www.codemotion.com\/magazine\/#website\",\"url\":\"https:\/\/www.codemotion.com\/magazine\/\",\"name\":\"Codemotion Magazine\",\"description\":\"We code the future. 
Together\",\"publisher\":{\"@id\":\"https:\/\/www.codemotion.com\/magazine\/#organization\"},\"potentialAction\":[{\"@type\":\"SearchAction\",\"target\":{\"@type\":\"EntryPoint\",\"urlTemplate\":\"https:\/\/www.codemotion.com\/magazine\/?s={search_term_string}\"},\"query-input\":{\"@type\":\"PropertyValueSpecification\",\"valueRequired\":true,\"valueName\":\"search_term_string\"}}],\"inLanguage\":\"en-US\"},{\"@type\":\"Organization\",\"@id\":\"https:\/\/www.codemotion.com\/magazine\/#organization\",\"name\":\"Codemotion\",\"url\":\"https:\/\/www.codemotion.com\/magazine\/\",\"logo\":{\"@type\":\"ImageObject\",\"inLanguage\":\"en-US\",\"@id\":\"https:\/\/www.codemotion.com\/magazine\/#\/schema\/logo\/image\/\",\"url\":\"https:\/\/www.codemotion.com\/magazine\/wp-content\/uploads\/2019\/11\/codemotionlogo.png\",\"contentUrl\":\"https:\/\/www.codemotion.com\/magazine\/wp-content\/uploads\/2019\/11\/codemotionlogo.png\",\"width\":225,\"height\":225,\"caption\":\"Codemotion\"},\"image\":{\"@id\":\"https:\/\/www.codemotion.com\/magazine\/#\/schema\/logo\/image\/\"},\"sameAs\":[\"https:\/\/www.facebook.com\/Codemotion.Italy\/\",\"https:\/\/x.com\/CodemotionIT\"]},{\"@type\":\"Person\",\"@id\":\"https:\/\/www.codemotion.com\/magazine\/#\/schema\/person\/201bb98b02412383686cced7521b861c\",\"name\":\"Codemotion\",\"image\":{\"@type\":\"ImageObject\",\"inLanguage\":\"en-US\",\"@id\":\"https:\/\/www.codemotion.com\/magazine\/#\/schema\/person\/image\/\",\"url\":\"https:\/\/www.codemotion.com\/magazine\/wp-content\/uploads\/2019\/11\/cropped-codemotionlogo-150x150.png\",\"contentUrl\":\"https:\/\/www.codemotion.com\/magazine\/wp-content\/uploads\/2019\/11\/cropped-codemotionlogo-150x150.png\",\"caption\":\"Codemotion\"},\"description\":\"Articles wirtten by the Codemotion staff. 
Tech news, inspiration, latest treends in software development and more.\",\"sameAs\":[\"https:\/\/x.com\/CodemotionIT\"],\"url\":\"https:\/\/www.codemotion.com\/magazine\/author\/codemotion-2\/\"}]}<\/script>\n<!-- \/ Yoast SEO Premium plugin. -->","yoast_head_json":{"title":"The Perceptron: The Foundation of Neural Architectures","description":"The perceptron is where it all began. Understanding this simple unit\u2014inputs, weights, dot products, and activation functions\u2014gives you the mental model to understand modern neural architectures, from CNNs to Transformers.","robots":{"index":"index","follow":"follow","max-snippet":"max-snippet:-1","max-image-preview":"max-image-preview:large","max-video-preview":"max-video-preview:-1"},"canonical":"https:\/\/www.codemotion.com\/magazine\/ai-ml\/deep-learning\/the-perceptron-the-first-building-block-of-neural-networks\/","og_locale":"en_US","og_type":"article","og_title":"The Perceptron: The First Building Block of Neural Networks","og_description":"The perceptron is where it all began. Understanding this simple unit\u2014inputs, weights, dot products, and activation functions\u2014gives you the mental model to understand modern neural architectures, from CNNs to Transformers.","og_url":"https:\/\/www.codemotion.com\/magazine\/ai-ml\/deep-learning\/the-perceptron-the-first-building-block-of-neural-networks\/","og_site_name":"Codemotion Magazine","article_publisher":"https:\/\/www.facebook.com\/Codemotion.Italy\/","article_published_time":"2026-02-18T19:17:01+00:00","article_modified_time":"2026-02-18T19:17:03+00:00","og_image":[{"width":1280,"height":720,"url":"https:\/\/www.codemotion.com\/magazine\/wp-content\/uploads\/2025\/07\/redes-neuronales.jpg","type":"image\/jpeg"}],"author":"Codemotion","twitter_card":"summary_large_image","twitter_creator":"@CodemotionIT","twitter_site":"@CodemotionIT","twitter_misc":{"Written by":"Codemotion","Est. 
reading time":"5 minutes"},"schema":{"@context":"https:\/\/schema.org","@graph":[{"@type":"Article","@id":"https:\/\/www.codemotion.com\/magazine\/ai-ml\/deep-learning\/the-perceptron-the-first-building-block-of-neural-networks\/#article","isPartOf":{"@id":"https:\/\/www.codemotion.com\/magazine\/ai-ml\/deep-learning\/the-perceptron-the-first-building-block-of-neural-networks\/"},"author":{"name":"Codemotion","@id":"https:\/\/www.codemotion.com\/magazine\/#\/schema\/person\/201bb98b02412383686cced7521b861c"},"headline":"The Perceptron: The First Building Block of Neural Networks","datePublished":"2026-02-18T19:17:01+00:00","dateModified":"2026-02-18T19:17:03+00:00","mainEntityOfPage":{"@id":"https:\/\/www.codemotion.com\/magazine\/ai-ml\/deep-learning\/the-perceptron-the-first-building-block-of-neural-networks\/"},"wordCount":1089,"commentCount":0,"publisher":{"@id":"https:\/\/www.codemotion.com\/magazine\/#organization"},"image":{"@id":"https:\/\/www.codemotion.com\/magazine\/ai-ml\/deep-learning\/the-perceptron-the-first-building-block-of-neural-networks\/#primaryimage"},"thumbnailUrl":"https:\/\/www.codemotion.com\/magazine\/wp-content\/uploads\/2025\/07\/redes-neuronales.jpg","keywords":["AI","perceptron","rosenblatt"],"articleSection":["Deep Learning"],"inLanguage":"en-US","potentialAction":[{"@type":"CommentAction","name":"Comment","target":["https:\/\/www.codemotion.com\/magazine\/ai-ml\/deep-learning\/the-perceptron-the-first-building-block-of-neural-networks\/#respond"]}]},{"@type":"WebPage","@id":"https:\/\/www.codemotion.com\/magazine\/ai-ml\/deep-learning\/the-perceptron-the-first-building-block-of-neural-networks\/","url":"https:\/\/www.codemotion.com\/magazine\/ai-ml\/deep-learning\/the-perceptron-the-first-building-block-of-neural-networks\/","name":"The Perceptron: The Foundation of Neural 
Architectures","isPartOf":{"@id":"https:\/\/www.codemotion.com\/magazine\/#website"},"primaryImageOfPage":{"@id":"https:\/\/www.codemotion.com\/magazine\/ai-ml\/deep-learning\/the-perceptron-the-first-building-block-of-neural-networks\/#primaryimage"},"image":{"@id":"https:\/\/www.codemotion.com\/magazine\/ai-ml\/deep-learning\/the-perceptron-the-first-building-block-of-neural-networks\/#primaryimage"},"thumbnailUrl":"https:\/\/www.codemotion.com\/magazine\/wp-content\/uploads\/2025\/07\/redes-neuronales.jpg","datePublished":"2026-02-18T19:17:01+00:00","dateModified":"2026-02-18T19:17:03+00:00","description":"The perceptron is where it all began. Understanding this simple unit\u2014inputs, weights, dot products, and activation functions\u2014gives you the mental model to understand modern neural architectures, from CNNs to Transformers.","breadcrumb":{"@id":"https:\/\/www.codemotion.com\/magazine\/ai-ml\/deep-learning\/the-perceptron-the-first-building-block-of-neural-networks\/#breadcrumb"},"inLanguage":"en-US","potentialAction":[{"@type":"ReadAction","target":["https:\/\/www.codemotion.com\/magazine\/ai-ml\/deep-learning\/the-perceptron-the-first-building-block-of-neural-networks\/"]}]},{"@type":"ImageObject","inLanguage":"en-US","@id":"https:\/\/www.codemotion.com\/magazine\/ai-ml\/deep-learning\/the-perceptron-the-first-building-block-of-neural-networks\/#primaryimage","url":"https:\/\/www.codemotion.com\/magazine\/wp-content\/uploads\/2025\/07\/redes-neuronales.jpg","contentUrl":"https:\/\/www.codemotion.com\/magazine\/wp-content\/uploads\/2025\/07\/redes-neuronales.jpg","width":1280,"height":720},{"@type":"BreadcrumbList","@id":"https:\/\/www.codemotion.com\/magazine\/ai-ml\/deep-learning\/the-perceptron-the-first-building-block-of-neural-networks\/#breadcrumb","itemListElement":[{"@type":"ListItem","position":1,"name":"Home","item":"https:\/\/www.codemotion.com\/magazine\/"},{"@type":"ListItem","position":2,"name":"AI\/ML","item":"https:\/\/www.codemotion.com
\/magazine\/ai-ml\/"},{"@type":"ListItem","position":3,"name":"Deep Learning","item":"https:\/\/www.codemotion.com\/magazine\/ai-ml\/deep-learning\/"},{"@type":"ListItem","position":4,"name":"The Perceptron: The First Building Block of Neural Networks"}]},{"@type":"WebSite","@id":"https:\/\/www.codemotion.com\/magazine\/#website","url":"https:\/\/www.codemotion.com\/magazine\/","name":"Codemotion Magazine","description":"We code the future. Together","publisher":{"@id":"https:\/\/www.codemotion.com\/magazine\/#organization"},"potentialAction":[{"@type":"SearchAction","target":{"@type":"EntryPoint","urlTemplate":"https:\/\/www.codemotion.com\/magazine\/?s={search_term_string}"},"query-input":{"@type":"PropertyValueSpecification","valueRequired":true,"valueName":"search_term_string"}}],"inLanguage":"en-US"},{"@type":"Organization","@id":"https:\/\/www.codemotion.com\/magazine\/#organization","name":"Codemotion","url":"https:\/\/www.codemotion.com\/magazine\/","logo":{"@type":"ImageObject","inLanguage":"en-US","@id":"https:\/\/www.codemotion.com\/magazine\/#\/schema\/logo\/image\/","url":"https:\/\/www.codemotion.com\/magazine\/wp-content\/uploads\/2019\/11\/codemotionlogo.png","contentUrl":"https:\/\/www.codemotion.com\/magazine\/wp-content\/uploads\/2019\/11\/codemotionlogo.png","width":225,"height":225,"caption":"Codemotion"},"image":{"@id":"https:\/\/www.codemotion.com\/magazine\/#\/schema\/logo\/image\/"},"sameAs":["https:\/\/www.facebook.com\/Codemotion.Italy\/","https:\/\/x.com\/CodemotionIT"]},{"@type":"Person","@id":"https:\/\/www.codemotion.com\/magazine\/#\/schema\/person\/201bb98b02412383686cced7521b861c","name":"Codemotion","image":{"@type":"ImageObject","inLanguage":"en-US","@id":"https:\/\/www.codemotion.com\/magazine\/#\/schema\/person\/image\/","url":"https:\/\/www.codemotion.com\/magazine\/wp-content\/uploads\/2019\/11\/cropped-codemotionlogo-150x150.png","contentUrl":"https:\/\/www.codemotion.com\/magazine\/wp-content\/uploads\/2019\/11\/cropped-code
motionlogo-150x150.png","caption":"Codemotion"},"description":"Articles wirtten by the Codemotion staff. Tech news, inspiration, latest treends in software development and more.","sameAs":["https:\/\/x.com\/CodemotionIT"],"url":"https:\/\/www.codemotion.com\/magazine\/author\/codemotion-2\/"}]}},"featured_image_src":"https:\/\/www.codemotion.com\/magazine\/wp-content\/uploads\/2025\/07\/redes-neuronales-600x400.jpg","featured_image_src_square":"https:\/\/www.codemotion.com\/magazine\/wp-content\/uploads\/2025\/07\/redes-neuronales-600x600.jpg","author_info":{"display_name":"Codemotion","author_link":"https:\/\/www.codemotion.com\/magazine\/author\/codemotion-2\/"},"uagb_featured_image_src":{"full":["https:\/\/www.codemotion.com\/magazine\/wp-content\/uploads\/2025\/07\/redes-neuronales.jpg",1280,720,false],"thumbnail":["https:\/\/www.codemotion.com\/magazine\/wp-content\/uploads\/2025\/07\/redes-neuronales-150x150.jpg",150,150,true],"medium":["https:\/\/www.codemotion.com\/magazine\/wp-content\/uploads\/2025\/07\/redes-neuronales-300x169.jpg",300,169,true],"medium_large":["https:\/\/www.codemotion.com\/magazine\/wp-content\/uploads\/2025\/07\/redes-neuronales-768x432.jpg",768,432,true],"large":["https:\/\/www.codemotion.com\/magazine\/wp-content\/uploads\/2025\/07\/redes-neuronales-1024x576.jpg",1024,576,true],"1536x1536":["https:\/\/www.codemotion.com\/magazine\/wp-content\/uploads\/2025\/07\/redes-neuronales.jpg",1280,720,false],"2048x2048":["https:\/\/www.codemotion.com\/magazine\/wp-content\/uploads\/2025\/07\/redes-neuronales.jpg",1280,720,false],"small-home-featured":["https:\/\/www.codemotion.com\/magazine\/wp-content\/uploads\/2025\/07\/redes-neuronales-100x100.jpg",100,100,true],"sidebar-featured":["https:\/\/www.codemotion.com\/magazine\/wp-content\/uploads\/2025\/07\/redes-neuronales-180x128.jpg",180,128,true],"genesis-singular-images":["https:\/\/www.codemotion.com\/magazine\/wp-content\/uploads\/2025\/07\/redes-neuronales-896x504.jpg",896,504,true],"arc
hive-featured":["https:\/\/www.codemotion.com\/magazine\/wp-content\/uploads\/2025\/07\/redes-neuronales-400x225.jpg",400,225,true],"gb-block-post-grid-landscape":["https:\/\/www.codemotion.com\/magazine\/wp-content\/uploads\/2025\/07\/redes-neuronales-600x400.jpg",600,400,true],"gb-block-post-grid-square":["https:\/\/www.codemotion.com\/magazine\/wp-content\/uploads\/2025\/07\/redes-neuronales-600x600.jpg",600,600,true]},"uagb_author_info":{"display_name":"Codemotion","author_link":"https:\/\/www.codemotion.com\/magazine\/author\/codemotion-2\/"},"uagb_comment_info":0,"uagb_excerpt":"Imagine a miniature brain\u2014a digital cell that can learn to make simple decisions. That&#8217;s essentially what a perceptron is: the forgotten hero that laid the foundation for the powerful neural networks driving today&#8217;s AI. Don&#8217;t worry about the jargon; we&#8217;ll break it down piece by piece until it clicks. A Real Machine That Changed Everything&#8230;&hellip;","lang":"en","_links":{"self":[{"href":"https:\/\/www.codemotion.com\/magazine\/wp-json\/wp\/v2\/posts\/35314","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.codemotion.com\/magazine\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.codemotion.com\/magazine\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.codemotion.com\/magazine\/wp-json\/wp\/v2\/users\/64"}],"replies":[{"embeddable":true,"href":"https:\/\/www.codemotion.com\/magazine\/wp-json\/wp\/v2\/comments?post=35314"}],"version-history":[{"count":1,"href":"https:\/\/www.codemotion.com\/magazine\/wp-json\/wp\/v2\/posts\/35314\/revisions"}],"predecessor-version":[{"id":35315,"href":"https:\/\/www.codemotion.com\/magazine\/wp-json\/wp\/v2\/posts\/35314\/revisions\/35315"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/www.codemotion.com\/magazine\/wp-json\/wp\/v2\/media\/33598"}],"wp:attachment":[{"href":"https:\/\/www.codemotion.com\/magazine\/wp-json\/wp\/v2\/media?parent=35
314"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.codemotion.com\/magazine\/wp-json\/wp\/v2\/categories?post=35314"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.codemotion.com\/magazine\/wp-json\/wp\/v2\/tags?post=35314"},{"taxonomy":"collections","embeddable":true,"href":"https:\/\/www.codemotion.com\/magazine\/wp-json\/wp\/v2\/collections?post=35314"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}