{"id":8576,"date":"2020-09-07T11:34:33","date_gmt":"2020-09-07T09:34:33","guid":{"rendered":"https:\/\/www.codemotion.com\/magazine\/?p=8576"},"modified":"2022-01-05T20:05:53","modified_gmt":"2022-01-05T19:05:53","slug":"machine-learning-edge-example","status":"publish","type":"post","link":"https:\/\/www.codemotion.com\/magazine\/ai-ml\/machine-learning\/machine-learning-edge-example\/","title":{"rendered":"ML at the Edge: a Practical Example"},"content":{"rendered":"\t\t\t\t<div class=\"wp-block-uagb-table-of-contents uagb-toc__align-left uagb-toc__columns-1  uagb-block-7c1ce160      \"\n\t\t\t\t\tdata-scroll= \"1\"\n\t\t\t\t\tdata-offset= \"30\"\n\t\t\t\t\tstyle=\"\"\n\t\t\t\t>\n\t\t\t\t<div class=\"uagb-toc__wrap\">\n\t\t\t\t\t\t<div class=\"uagb-toc__title\">\n\t\t\t\t\t\t\tTable Of Contents\t\t\t\t\t\t<\/div>\n\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t<div class=\"uagb-toc__list-wrap \">\n\t\t\t\t\t\t<ol class=\"uagb-toc__list\"><li class=\"uagb-toc__list\"><a href=\"#gesture-recognition\" class=\"uagb-toc-link__trigger\">Gesture recognition<\/a><li class=\"uagb-toc__list\"><a href=\"#techniques-for-gesture-recognition\" class=\"uagb-toc-link__trigger\">Techniques for gesture recognition<\/a><li class=\"uagb-toc__list\"><a href=\"#machine-learning-for-gesture-recognition\" class=\"uagb-toc-link__trigger\">Machine learning for gesture recognition<\/a><li class=\"uagb-toc__list\"><a href=\"#a-practical-example-of-edge-ml\" class=\"uagb-toc-link__trigger\">A practical example of edge ML<\/a><li class=\"uagb-toc__list\"><a href=\"#running-the-model\" class=\"uagb-toc-link__trigger\">Running the model<\/a><li class=\"uagb-toc__list\"><a href=\"#conclusions-and-what-next\" class=\"uagb-toc-link__trigger\">Conclusions and what next<\/a><\/ol>\t\t\t\t\t<\/div>\n\t\t\t\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\n\n\n<p><span id=\"urn:enhancement-4705bb2e\" class=\"textannotation disambiguated wl-thing\" 
itemid=\"http:\/\/data.wordlift.io\/wl01770\/entity\/machine_learning_2\">Machine learning<\/span> is the primary <span id=\"urn:enhancement-c0485a1f\" class=\"textannotation disambiguated wl-thing\" itemid=\"http:\/\/data.wordlift.io\/wl01770\/entity\/methodology\">methodology<\/span> for delivering <span id=\"urn:enhancement-b4782433\" class=\"textannotation disambiguated wl-thing\" itemid=\"http:\/\/data.wordlift.io\/wl01770\/entity\/applications_of_artificial_intelligence\">AI applications<\/span>. In previous <span id=\"urn:enhancement-22b9c3b2\" class=\"textannotation disambiguated wl-thing\" itemid=\"http:\/\/data.wordlift.io\/wl01770\/entity\/article_publishing\">articles<\/span>, I discussed <a href=\"https:\/\/www.codemotion.com\/magazine\/dev-hub\/machine-learning-dev\/the-cutting-edge-of-real-time-ai\/\">the main reasons behind moving machine learning to the network edge<\/a>. These include the need for real-time <span id=\"urn:enhancement-2b8e403f\" class=\"textannotation disambiguated wl-thing\" itemid=\"http:\/\/data.wordlift.io\/wl01770\/entity\/performance\">performance<\/span>, <span id=\"urn:enhancement-3c3112f6\" class=\"textannotation disambiguated wl-thing\" itemid=\"http:\/\/data.wordlift.io\/wl01770\/entity\/security\">security<\/span> considerations, and a lack of connectivity. However, ML models typically require powerful <span id=\"urn:enhancement-a2d77aae\" class=\"textannotation disambiguated wl-thing\" itemid=\"http:\/\/data.wordlift.io\/wl01770\/entity\/computer_hardware\">hardware<\/span> to run efficiently. Fortunately, in recent years, MCUs have become ever more powerful, alongside the release of new frameworks enabling edge ML. 
TensorFlow Lite, in particular, has gained significant traction.<\/p>\n\n\n\n<p>The <a href=\"https:\/\/www.codemotion.com\/magazine\/dev-hub\/machine-learning-dev\/edge-machine-learning\/\">last article<\/a> discussed choosing a suitable platform for your edge ML <span id=\"urn:enhancement-e5182cd4\" class=\"textannotation disambiguated wl-thing\" itemid=\"http:\/\/data.wordlift.io\/wl01770\/entity\/project\">project<\/span>. It also explored some of the limitations imposed by <strong>TensorFlow Lite<\/strong>. In this <span id=\"urn:enhancement-e60aab68\" class=\"textannotation disambiguated wl-thing\" itemid=\"http:\/\/data.wordlift.io\/wl01770\/entity\/article_publishing\">article<\/span>, we look at a practical example of running a TensorFlow Lite <span id=\"urn:enhancement-73bf46ac\" class=\"textannotation disambiguated wl-thing\" itemid=\"http:\/\/data.wordlift.io\/wl01770\/entity\/conceptual_model\">model<\/span> on an <a aria-label=\"undefined (opens in a new tab)\" href=\"https:\/\/www.mouser.com\/new\/nxp-semiconductors\/nxp-mimxrt1050-evk\/\" target=\"_blank\" rel=\"noreferrer noopener\">NXP i.MX RT1050 EVK<\/a>. 
Specifically, I\u2019ll show how you can perform <span id=\"urn:enhancement-3353524d\" class=\"textannotation disambiguated wl-thing\" itemid=\"http:\/\/data.wordlift.io\/wl01770\/entity\/gesture\">gesture<\/span> recognition using the built-in accelerometers and a simple neural network.<\/p>\n\n\n\n<h2 class=\"wp-block-heading\">Gesture recognition<\/h2>\n\n\n\n<p><span id=\"urn:enhancement-3a7c59fb\" class=\"textannotation disambiguated wl-thing\" itemid=\"http:\/\/data.wordlift.io\/wl01770\/entity\/gesture_recognition\">Gesture recognition<\/span> is a growing field in human-computer <span id=\"urn:enhancement-fa933873\" class=\"textannotation disambiguated wl-thing\" itemid=\"http:\/\/data.wordlift.io\/wl01770\/entity\/interaction\">interaction<\/span> using a combination of <span id=\"urn:enhancement-c302ebe8\" class=\"textannotation disambiguated wl-thing\" itemid=\"http:\/\/data.wordlift.io\/wl01770\/entity\/sensor\">sensors<\/span> and algorithms to identify specific human gestures. It enables many <span id=\"urn:enhancement-47abb12c\" class=\"textannotation disambiguated wl-thing\" itemid=\"http:\/\/data.wordlift.io\/wl01770\/entity\/human-computer_interaction_2\">HCI<\/span> <span id=\"urn:enhancement-b4c4395e\" class=\"textannotation disambiguated wl-thing\" itemid=\"http:\/\/data.wordlift.io\/wl01770\/entity\/application_software\">applications<\/span> ranging from controlling your car stereo to <span id=\"urn:enhancement-d849b143\" class=\"textannotation disambiguated wl-thing\" itemid=\"http:\/\/data.wordlift.io\/wl01770\/entity\/interactivity\">interactive<\/span> <span id=\"urn:enhancement-6743ae1\" class=\"textannotation disambiguated wl-thing\" itemid=\"http:\/\/data.wordlift.io\/wl01770\/entity\/video_game\">gaming<\/span>. 
It can even <a href=\"http:\/\/bkveton.com\/docs\/pervasive2009.pdf\" target=\"_blank\" aria-label=\"undefined (opens in a new tab)\" rel=\"noreferrer noopener\">identify individuals<\/a> based on the different ways they hold and use a <span id=\"urn:enhancement-2dc63bc\" class=\"textannotation disambiguated wl-thing\" itemid=\"http:\/\/data.wordlift.io\/wl01770\/entity\/computer_hardware\">device<\/span>.&nbsp;<\/p>\n\n\n\n<h3 class=\"wp-block-heading\">Applications of gesture recognition<\/h3>\n\n\n\n<p>One of the earliest well-known <span id=\"urn:enhancement-5bedd251\" class=\"textannotation disambiguated wl-thing\" itemid=\"http:\/\/data.wordlift.io\/wl01770\/entity\/application_software\">applications<\/span> of <span id=\"urn:enhancement-788bae5\" class=\"textannotation disambiguated wl-thing\" itemid=\"http:\/\/data.wordlift.io\/wl01770\/entity\/gesture\">gesture<\/span> recognition was the <span id=\"urn:enhancement-86f8959f\" class=\"textannotation disambiguated wl-organization\" itemid=\"http:\/\/data.wordlift.io\/wl01770\/entity\/nintendo\">Nintendo<\/span> <span id=\"urn:enhancement-7160587\" class=\"textannotation disambiguated wl-thing\" itemid=\"http:\/\/data.wordlift.io\/wl01770\/entity\/wii\">Wii<\/span>. <span id=\"urn:enhancement-5340fd17\" class=\"textannotation disambiguated wl-thing\" itemid=\"http:\/\/data.wordlift.io\/wl01770\/entity\/wii\">The Wii<\/span> used <span id=\"urn:enhancement-537cd229\" class=\"textannotation disambiguated wl-thing\" itemid=\"http:\/\/data.wordlift.io\/wl01770\/entity\/handheld_game_console\">hand-held<\/span> accelerometers enabling players to interact with various virtual sports events. 
It was only able to detect large gestures, but that was ideal for playing virtual sports like tennis or golf.<\/p>\n\n\n\n<div class=\"wp-block-image\"><figure class=\"aligncenter size-large\"><img decoding=\"async\" src=\"https:\/\/www.codemotion.com\/magazine\/wp-content\/uploads\/2020\/09\/image3-1024x695.png\" alt=\"\" class=\"wp-image-8578\"\/><figcaption>The Nintendo Wii offered simple controllers with accelerometers to enable gesture recognition<\/figcaption><\/figure><\/div>\n\n\n\n<p>More recently, we have seen a rush to install gesture recognition as a safety feature in cars. Many motoring accidents are caused by the driver being distracted, one of the most frequent distractions being the audio <span id=\"urn:enhancement-4cc43e77\" class=\"textannotation disambiguated wl-thing\" itemid=\"http:\/\/data.wordlift.io\/wl01770\/entity\/system\">system<\/span>. <span id=\"urn:enhancement-b5779f47\" class=\"textannotation disambiguated wl-thing\" itemid=\"http:\/\/data.wordlift.io\/wl01770\/entity\/gesture_recognition\">Gesture recognition<\/span> allows the driver to control the <span id=\"urn:enhancement-af27f465\" class=\"textannotation disambiguated wl-thing\" itemid=\"http:\/\/data.wordlift.io\/wl01770\/entity\/system\">system<\/span> without taking their eyes off the road: waving in a specific <span id=\"urn:enhancement-6b356244\" class=\"textannotation disambiguated wl-thing\" itemid=\"http:\/\/data.wordlift.io\/wl01770\/entity\/pattern\">pattern<\/span> to skip a track, increase volume, or change channel.&nbsp;<\/p>\n\n\n\n<h2 class=\"wp-block-heading\">Techniques for gesture recognition<\/h2>\n\n\n\n<p>There are three broad forms of <span id=\"urn:enhancement-564b9566\" class=\"textannotation disambiguated wl-thing\" itemid=\"http:\/\/data.wordlift.io\/wl01770\/entity\/gesture\">gesture<\/span> recognition. 
The first relies on detecting movement in a <span id=\"urn:enhancement-d145a9cd\" class=\"textannotation disambiguated wl-thing\" itemid=\"http:\/\/data.wordlift.io\/wl01770\/entity\/sensor\">sensor<\/span>\u2019s <span id=\"urn:enhancement-50459e8b\" class=\"textannotation disambiguated wl-thing\" itemid=\"http:\/\/data.wordlift.io\/wl01770\/entity\/field_of_view\">field of view<\/span>. The second uses <span id=\"urn:enhancement-b60f2d72\" class=\"textannotation disambiguated wl-thing\" itemid=\"http:\/\/data.wordlift.io\/wl01770\/entity\/accelerometer\">accelerometers<\/span> within a <span id=\"urn:enhancement-d969d7a6\" class=\"textannotation disambiguated wl-thing\" itemid=\"http:\/\/data.wordlift.io\/wl01770\/entity\/computer_hardware\">device<\/span> being held or worn by the <span id=\"urn:enhancement-a85f940e\" class=\"textannotation disambiguated wl-thing\" itemid=\"http:\/\/data.wordlift.io\/wl01770\/entity\/user_computing\">user<\/span>. The third uses computer vision to process the input from a camera.<\/p>\n\n\n\n<h3 class=\"wp-block-heading\">IR sensors<\/h3>\n\n\n\n<p><span id=\"urn:enhancement-a52a09df\" class=\"textannotation disambiguated wl-thing\" itemid=\"http:\/\/data.wordlift.io\/wl01770\/entity\/infrared\">IR<\/span> movement detection has existed for decades and is still widely used in burglar alarm systems, for example. However, this sort of movement detection is binary. That is to say, either it detects movement or it doesn\u2019t, which is not sufficient to enable <span id=\"urn:enhancement-b3ee1c6a\" class=\"textannotation disambiguated wl-thing\" itemid=\"http:\/\/data.wordlift.io\/wl01770\/entity\/gesture\">gesture<\/span> recognition. Instead, a <a href=\"https:\/\/eu.mouser.com\/new\/terabee\/terabee-teraranger-evo-64px-sensors\/\" target=\"_blank\" aria-label=\"undefined (opens in a new tab)\" rel=\"noreferrer noopener\">time of flight sensor<\/a> is required. 
These send out pulses of <span id=\"urn:enhancement-45e571e3\" class=\"textannotation disambiguated wl-thing\" itemid=\"http:\/\/data.wordlift.io\/wl01770\/entity\/infrared\">IR<\/span> energy and measure the time for the reflected pulse to return, thus identifying simple gestures.<\/p>\n\n\n\n<h3 class=\"wp-block-heading\">Accelerometers<\/h3>\n\n\n\n<p>Our world is full of <span id=\"urn:enhancement-81e2df17\" class=\"textannotation disambiguated wl-thing\" itemid=\"http:\/\/data.wordlift.io\/wl01770\/entity\/accelerometer\">accelerometers<\/span>: in <span id=\"urn:enhancement-10503ef7\" class=\"textannotation disambiguated wl-thing\" itemid=\"http:\/\/data.wordlift.io\/wl01770\/entity\/mobile_computing\">mobile<\/span> phones, in <span id=\"urn:enhancement-7952c431\" class=\"textannotation disambiguated wl-thing\" itemid=\"http:\/\/data.wordlift.io\/wl01770\/entity\/video_game\">video game<\/span> <span id=\"urn:enhancement-a04ee8bc\" class=\"textannotation disambiguated wl-thing\" itemid=\"http:\/\/data.wordlift.io\/wl01770\/entity\/game_controller\">controllers<\/span>, even in <span id=\"urn:enhancement-43131f7a\" class=\"textannotation disambiguated wl-thing\" itemid=\"http:\/\/data.wordlift.io\/wl01770\/entity\/remote_control\">TV remote<\/span> controls. Modern <span id=\"urn:enhancement-79ca5062\" class=\"textannotation disambiguated wl-thing\" itemid=\"http:\/\/data.wordlift.io\/wl01770\/entity\/accelerometer\">accelerometers<\/span> are typically 6 axis, and often incorporate other <span id=\"urn:enhancement-f5778345\" class=\"textannotation disambiguated wl-thing\" itemid=\"http:\/\/data.wordlift.io\/wl01770\/entity\/sensor\">sensors<\/span>, providing <span id=\"urn:enhancement-31d60ae9\" class=\"textannotation disambiguated wl-thing\" itemid=\"http:\/\/data.wordlift.io\/wl01770\/entity\/data\">data<\/span> on movement in the XYZ directions, along with pitch, yaw, and roll. 
All of which can be used to determine <span id=\"urn:enhancement-67315ca1\" class=\"textannotation disambiguated wl-thing\" itemid=\"http:\/\/data.wordlift.io\/wl01770\/entity\/gesture\">gestures<\/span>.<\/p>\n\n\n\n<div class=\"wp-block-image\"><figure class=\"aligncenter size-large\"><img loading=\"lazy\" decoding=\"async\" width=\"249\" height=\"184\" src=\"https:\/\/www.codemotion.com\/magazine\/wp-content\/uploads\/2020\/09\/image2.png\" alt=\"\" class=\"wp-image-8579\"\/><\/figure><\/div>\n\n\n\n<h3 class=\"wp-block-heading\">Computer vision<\/h3>\n\n\n\n<p>The most powerful approach for <span id=\"urn:enhancement-a8a60a3f\" class=\"textannotation disambiguated wl-thing\" itemid=\"http:\/\/data.wordlift.io\/wl01770\/entity\/gesture\">gesture<\/span> recognition involves the <span id=\"urn:enhancement-3b731faf\" class=\"textannotation disambiguated wl-thing\" itemid=\"http:\/\/data.wordlift.io\/wl01770\/entity\/application_software\">application<\/span> of computer vision. Here, you train a set of ML models to identify a hand in a video, track its movement, and recognise <span id=\"urn:enhancement-f13b3da\" class=\"textannotation disambiguated wl-thing\" itemid=\"http:\/\/data.wordlift.io\/wl01770\/entity\/gesture\">gestures<\/span>. 
This approach has two key advantages over the other techniques: it doesn\u2019t require the <span id=\"urn:enhancement-2a3fccff\" class=\"textannotation disambiguated wl-thing\" itemid=\"http:\/\/data.wordlift.io\/wl01770\/entity\/user_computing\">user<\/span> to be holding a physical object (as is the case with <span id=\"urn:enhancement-64428295\" class=\"textannotation disambiguated wl-thing\" itemid=\"http:\/\/data.wordlift.io\/wl01770\/entity\/accelerometer\">accelerometers<\/span>), and it can work at much longer ranges than <span id=\"urn:enhancement-a225bf8c\" class=\"textannotation disambiguated wl-thing\" itemid=\"http:\/\/data.wordlift.io\/wl01770\/entity\/system\">systems<\/span> based on time of flight <span id=\"urn:enhancement-8d512ed6\" class=\"textannotation disambiguated wl-thing\" itemid=\"http:\/\/data.wordlift.io\/wl01770\/entity\/sensor\">sensors<\/span>.&nbsp;<\/p>\n\n\n\n<h3 class=\"wp-block-heading\">Time of flight cameras<\/h3>\n\n\n\n<p>One of the newest approaches for <span id=\"urn:enhancement-c6147e59\" class=\"textannotation disambiguated wl-thing\" itemid=\"http:\/\/data.wordlift.io\/wl01770\/entity\/gesture\">gesture<\/span> recognition takes advantage of <a href=\"https:\/\/eu.mouser.com\/new\/terabee\/terabee-3dcam-camera\/\" target=\"_blank\" aria-label=\"undefined (opens in a new tab)\" rel=\"noreferrer noopener\">time-of-flight (ToF) cameras.<\/a> These <span id=\"urn:enhancement-ca2c5f78\" class=\"textannotation disambiguated wl-thing\" itemid=\"http:\/\/data.wordlift.io\/wl01770\/entity\/computer_hardware\">devices<\/span> combine a camera with the ability to measure the distance to an object. They provide the ability to create 3D <span id=\"urn:enhancement-f3776037\" class=\"textannotation disambiguated wl-thing\" itemid=\"http:\/\/data.wordlift.io\/wl01770\/entity\/scientific_modelling\">models<\/span> of the environment without the need for expensive LIDAR scanning technology. 
As a result, they are perfect for recognising complex <span id=\"urn:enhancement-fc3f2df1\" class=\"textannotation disambiguated wl-thing\" itemid=\"http:\/\/data.wordlift.io\/wl01770\/entity\/gesture\">gestures<\/span> at a distance.<\/p>\n\n\n\n<h2 class=\"wp-block-heading\">Machine learning for gesture recognition<\/h2>\n\n\n\n<p><span id=\"urn:enhancement-6abc8419\" class=\"textannotation disambiguated wl-thing\" itemid=\"http:\/\/data.wordlift.io\/wl01770\/entity\/machine_learning_2\">Machine learning<\/span> is one of the most flexible techniques for enabling <span id=\"urn:enhancement-2574a2a4\" class=\"textannotation disambiguated wl-thing\" itemid=\"http:\/\/data.wordlift.io\/wl01770\/entity\/gesture\">gesture<\/span> recognition. It allows the identification of complex <span id=\"urn:enhancement-8e968960\" class=\"textannotation disambiguated wl-thing\" itemid=\"http:\/\/data.wordlift.io\/wl01770\/entity\/gesture\">gestures<\/span> yet is agnostic to the person making the <span id=\"urn:enhancement-66482e5d\" class=\"textannotation disambiguated wl-thing\" itemid=\"http:\/\/data.wordlift.io\/wl01770\/entity\/gesture\">gesture<\/span>. ML models can take inputs from multiple different <span id=\"urn:enhancement-172e64d2\" class=\"textannotation disambiguated wl-thing\" itemid=\"http:\/\/data.wordlift.io\/wl01770\/entity\/sensor\">sensors<\/span> and can cope with noisy data, outperforming the purely algorithmic approach taken by <span id=\"urn:enhancement-60afe93f\" class=\"textannotation disambiguated wl-thing\" itemid=\"http:\/\/data.wordlift.io\/wl01770\/entity\/system\">systems<\/span> like <span id=\"urn:enhancement-34e782c0\" class=\"textannotation disambiguated wl-organization\" itemid=\"http:\/\/data.wordlift.io\/wl01770\/entity\/nintendo\">Nintendo<\/span> <span id=\"urn:enhancement-2bae4e85\" class=\"textannotation disambiguated wl-thing\" itemid=\"http:\/\/data.wordlift.io\/wl01770\/entity\/wii\">Wii<\/span> or in-car audio. 
So, how do you create a <span id=\"urn:local-annotation-590867\" class=\"textannotation disambiguated\" itemid=\"http:\/\/data.wordlift.io\/wl01770\/entity\/machine_learning_2\">machine learning<\/span> <span id=\"urn:enhancement-efd610e\" class=\"textannotation disambiguated wl-thing\" itemid=\"http:\/\/data.wordlift.io\/wl01770\/entity\/conceptual_model\">model<\/span> for <span id=\"urn:enhancement-290586bc\" class=\"textannotation disambiguated wl-thing\" itemid=\"http:\/\/data.wordlift.io\/wl01770\/entity\/gesture\">gesture<\/span> recognition and port it to your edge device?<\/p>\n\n\n\n<h2 class=\"wp-block-heading\">A practical example of edge ML<\/h2>\n\n\n\n<p><a href=\"https:\/\/www.codemotion.com\/magazine\/dev-hub\/machine-learning-dev\/edge-machine-learning\/\">The previous article<\/a> explained the steps needed to create a TensorFlow Lite <span id=\"urn:enhancement-e51d4765\" class=\"textannotation disambiguated wl-thing\" itemid=\"http:\/\/data.wordlift.io\/wl01770\/entity\/conceptual_model\">model<\/span> and port it to a <span id=\"urn:enhancement-d5753e6d\" class=\"textannotation disambiguated wl-thing\" itemid=\"http:\/\/data.wordlift.io\/wl01770\/entity\/computer_hardware\">device<\/span>. 
Now let\u2019s see what this looks like in practice using a simple <span id=\"urn:enhancement-5f7657f8\" class=\"textannotation disambiguated wl-thing\" itemid=\"http:\/\/data.wordlift.io\/wl01770\/entity\/gesture\">gesture<\/span> recognition <span id=\"urn:enhancement-979c87bd\" class=\"textannotation disambiguated wl-thing\" itemid=\"http:\/\/data.wordlift.io\/wl01770\/entity\/conceptual_model\">model<\/span>.&nbsp;<\/p>\n\n\n\n<p>This example uses the NXP i.MX RT1050 EVK, available from Mouser, one of the most powerful <span id=\"urn:enhancement-f48f61e4\" class=\"textannotation disambiguated wl-thing\" itemid=\"http:\/\/data.wordlift.io\/wl01770\/entity\/microcontroller\">MCU<\/span> boards in its <span id=\"urn:enhancement-74819e4a\" class=\"textannotation disambiguated wl-thing\" itemid=\"http:\/\/data.wordlift.io\/wl01770\/entity\/class_computer_programming\">class<\/span>. It is based on an <span id=\"urn:enhancement-504152eb\" class=\"textannotation disambiguated wl-thing\" itemid=\"http:\/\/data.wordlift.io\/wl01770\/entity\/arm_architecture\">ARM<\/span> Cortex-M7 processor running at up to 600MHz. 
The board <span id=\"urn:enhancement-f40bda4f\" class=\"textannotation disambiguated wl-thing\" itemid=\"http:\/\/data.wordlift.io\/wl01770\/entity\/specification_technical_standard\">specification<\/span> includes:<\/p>\n\n\n\n<ul class=\"wp-block-list\"><li><strong>Processor<\/strong>: MIMXRT1052DVL6A processor clocked at up to 600MHz<\/li><li><strong>Memory<\/strong>:<ul><li>256 Mb SDRAM memory<\/li><li>512 Mb Hyper Flash<\/li><li>64 Mb QSPI Flash<\/li><li>Footprint for QSPI Flash<\/li><li>TF socket for SD card<\/li><\/ul><\/li><li><strong>Display and Audio<\/strong>:<ul><li>Parallel LCD connector<\/li><li>Camera connector<\/li><li>Audio codec<\/li><li>4-pole audio headphone jack<\/li><li>External speaker connection<\/li><li>Microphone<\/li><li>S\/PDIF connector<\/li><\/ul><\/li><li><strong>Connectivity<\/strong>:<ul><li>Micro USB host and OTG connectors<\/li><li>Ethernet (10\/100T) connector<\/li><li>CAN transceivers<\/li><li>Arduino\u00ae interface<\/li><\/ul><\/li><li><strong>IoT sensors<\/strong>: FXOS8700CQ 3D Accelerometer and Magnetometer<\/li><\/ul>\n\n\n\n<p>This board is ideally suited for running sophisticated and powerful ML models at the edge. 
It is capable of handling large neural networks, making it suitable for <span id=\"urn:local-annotation-662468\" class=\"textannotation disambiguated\" itemid=\"http:\/\/data.wordlift.io\/wl01770\/entity\/image_recognition\">image recognition<\/span> tasks.<\/p>\n\n\n\n<div class=\"wp-block-image\"><figure class=\"aligncenter size-large\"><img decoding=\"async\" src=\"https:\/\/www.codemotion.com\/magazine\/wp-content\/uploads\/2020\/09\/image5-1024x708.png\" alt=\"\" class=\"wp-image-8580\"\/><figcaption>Top view of NXP iMX RT1050 evaluation kit<\/figcaption><\/figure><\/div>\n\n\n\n<h3 class=\"wp-block-heading\">Selecting a model<\/h3>\n\n\n\n<p>You could choose to create a <span id=\"urn:enhancement-f68d482e\" class=\"textannotation disambiguated wl-thing\" itemid=\"http:\/\/data.wordlift.io\/wl01770\/entity\/conceptual_model\">model<\/span> from scratch. However, this is only worthwhile if you need to use a non-standard <span id=\"urn:enhancement-38217470\" class=\"textannotation disambiguated wl-thing\" itemid=\"http:\/\/data.wordlift.io\/wl01770\/entity\/sensor\">sensor<\/span> or identify unusual <span id=\"urn:enhancement-14374331\" class=\"textannotation disambiguated wl-thing\" itemid=\"http:\/\/data.wordlift.io\/wl01770\/entity\/gesture\">gestures<\/span>. 
Typically, it\u2019s easier to find an existing <span id=\"urn:enhancement-56e402b2\" class=\"textannotation disambiguated wl-thing\" itemid=\"http:\/\/data.wordlift.io\/wl01770\/entity\/conceptual_model\">model<\/span> and retrain it.&nbsp;An excellent starting point to find existing <span id=\"urn:enhancement-d3f21657\" class=\"textannotation disambiguated wl-thing\" itemid=\"http:\/\/data.wordlift.io\/wl01770\/entity\/scientific_modelling\">models<\/span> is <a href=\"https:\/\/modelzoo.co\/\">Model Zoo<\/a> &#8211; a vast collection of ML <span id=\"urn:enhancement-9487bb6a\" class=\"textannotation disambiguated wl-thing\" itemid=\"http:\/\/data.wordlift.io\/wl01770\/entity\/scientific_modelling\">models<\/span> you can use for free. However, in this case, we want to find a <span id=\"urn:enhancement-fe19a1e9\" class=\"textannotation disambiguated wl-thing\" itemid=\"http:\/\/data.wordlift.io\/wl01770\/entity\/conceptual_model\">model<\/span> that converts <span id=\"urn:enhancement-dc21693a\" class=\"textannotation disambiguated wl-thing\" itemid=\"http:\/\/data.wordlift.io\/wl01770\/entity\/accelerometer\">accelerometer<\/span> readings into <span id=\"urn:enhancement-809146fe\" class=\"textannotation disambiguated wl-thing\" itemid=\"http:\/\/data.wordlift.io\/wl01770\/entity\/gesture\">gestures<\/span>. Currently, there is no suitable <span id=\"urn:enhancement-4680c98c\" class=\"textannotation disambiguated wl-thing\" itemid=\"http:\/\/data.wordlift.io\/wl01770\/entity\/conceptual_model\">model<\/span> in Model Zoo. Instead, we will use one of the sample <span id=\"urn:enhancement-c9a0674b\" class=\"textannotation disambiguated wl-thing\" itemid=\"http:\/\/data.wordlift.io\/wl01770\/entity\/scientific_modelling\">models<\/span> provided for TensorFlow Lite. 
This example is called <a href=\"https:\/\/github.com\/tensorflow\/tensorflow\/tree\/master\/tensorflow\/lite\/micro\/examples\/magic_wand\">Magic Wand<\/a> and is trained to recognise the three gestures shown below.<\/p>\n\n\n\n<div class=\"wp-block-image\"><figure class=\"aligncenter size-large\"><img loading=\"lazy\" decoding=\"async\" width=\"587\" height=\"210\" src=\"https:\/\/www.codemotion.com\/magazine\/wp-content\/uploads\/2020\/09\/image1.png\" alt=\"\" class=\"wp-image-8581\" srcset=\"https:\/\/www.codemotion.com\/magazine\/wp-content\/uploads\/2020\/09\/image1.png 587w, https:\/\/www.codemotion.com\/magazine\/wp-content\/uploads\/2020\/09\/image1-300x107.png 300w\" sizes=\"auto, (max-width: 587px) 100vw, 587px\" \/><\/figure><\/div>\n\n\n\n<h3 class=\"wp-block-heading\">How to build your edge ML model<\/h3>\n\n\n\n<p>The first step is to set up your <span id=\"urn:enhancement-6c303dab\" class=\"textannotation disambiguated wl-thing\" itemid=\"http:\/\/data.wordlift.io\/wl01770\/entity\/development_environment_software_development_process\">development environment<\/span>. I recommend using <span id=\"urn:enhancement-e5c09c72\" class=\"textannotation disambiguated wl-organization\" itemid=\"http:\/\/data.wordlift.io\/wl01770\/entity\/nxp_semiconductors\">NXP<\/span>\u2019s <a href=\"https:\/\/www.nxp.com\/design\/software\/development-software\/mcuxpresso-software-and-tools-\/mcuxpresso-integrated-development-environment-ide:MCUXpresso-IDE\" target=\"_blank\" aria-label=\"undefined (opens in a new tab)\" rel=\"noreferrer noopener\">MCUXpresso<\/a>, which is based on <span id=\"urn:enhancement-d8840ff6\" class=\"textannotation disambiguated wl-creative-work\" itemid=\"http:\/\/data.wordlift.io\/wl01770\/entity\/eclipse_software\">Eclipse<\/span>. 
You will also need to <a href=\"https:\/\/mcuxpresso.nxp.com\/en\/builder\" target=\"_blank\" aria-label=\"undefined (opens in a new tab)\" rel=\"noreferrer noopener\">build and download<\/a> a suitable <span id=\"urn:enhancement-d5707d7\" class=\"textannotation disambiguated wl-thing\" itemid=\"http:\/\/data.wordlift.io\/wl01770\/entity\/software_development_kit\">SDK<\/span> for your development board. This requires you to create an <span id=\"urn:enhancement-ab98f9e6\" class=\"textannotation disambiguated wl-thing\" itemid=\"http:\/\/data.wordlift.io\/wl01770\/entity\/user_computing\">account<\/span> on the <span id=\"urn:enhancement-7521fb97\" class=\"textannotation disambiguated wl-organization\" itemid=\"http:\/\/data.wordlift.io\/wl01770\/entity\/nxp_semiconductors\">NXP<\/span> website.&nbsp;<\/p>\n\n\n\n<p>When you are creating your <span id=\"urn:enhancement-dfcba076\" class=\"textannotation disambiguated wl-thing\" itemid=\"http:\/\/data.wordlift.io\/wl01770\/entity\/software_development_kit\">SDK<\/span>, make sure you remember to select the correct board and include the <strong>eIQ<\/strong>, <strong>ISSDK,<\/strong> and <strong>sdmmc<\/strong> <span id=\"urn:enhancement-8527310b\" class=\"textannotation disambiguated wl-thing\" itemid=\"http:\/\/data.wordlift.io\/wl01770\/entity\/modular_programming\">modules<\/span>. 
These ensure your <span id=\"urn:enhancement-f70288a8\" class=\"textannotation disambiguated wl-thing\" itemid=\"http:\/\/data.wordlift.io\/wl01770\/entity\/software_development_kit\">SDK<\/span> can read from the <span id=\"urn:enhancement-5c1d5b5f\" class=\"textannotation disambiguated wl-thing\" itemid=\"http:\/\/data.wordlift.io\/wl01770\/entity\/accelerometer\">accelerometer<\/span>, provide native support for neural networks, and enable <span id=\"urn:enhancement-c58c5ec6\" class=\"textannotation disambiguated wl-thing\" itemid=\"http:\/\/data.wordlift.io\/wl01770\/entity\/secure_digital\">SD card<\/span> <span id=\"urn:enhancement-e01c3ffe\" class=\"textannotation disambiguated wl-thing\" itemid=\"http:\/\/data.wordlift.io\/wl01770\/entity\/input_output\">IO<\/span>.&nbsp;<\/p>\n\n\n\n<p>Now install your <span id=\"urn:enhancement-77020b4c\" class=\"textannotation disambiguated wl-thing\" itemid=\"http:\/\/data.wordlift.io\/wl01770\/entity\/software_development_kit\">SDK<\/span> into MCUXpresso and create a new <span id=\"urn:enhancement-2145ec35\" class=\"textannotation disambiguated wl-thing\" itemid=\"http:\/\/data.wordlift.io\/wl01770\/entity\/project\">project<\/span> called Magic Wand. 
Use git to clone the <a href=\"https:\/\/github.com\/tensorflow\/tensorflow\/tree\/master\/tensorflow\/lite\/micro\/examples\/magic_wand\" target=\"_blank\" aria-label=\"undefined (opens in a new tab)\" rel=\"noreferrer noopener\">Magic Wand<\/a> repository to the correct <span id=\"urn:enhancement-69341158\" class=\"textannotation disambiguated wl-thing\" itemid=\"http:\/\/data.wordlift.io\/wl01770\/entity\/project\">project<\/span> folder on your local <span id=\"urn:enhancement-75cdd145\" class=\"textannotation disambiguated wl-thing\" itemid=\"http:\/\/data.wordlift.io\/wl01770\/entity\/system\">system<\/span>.&nbsp;<\/p>\n\n\n\n<h3 class=\"wp-block-heading\">Converting the ML model to TensorFlow Lite<\/h3>\n\n\n\n<p>The current <span id=\"urn:enhancement-61046d1c\" class=\"textannotation disambiguated wl-thing\" itemid=\"http:\/\/data.wordlift.io\/wl01770\/entity\/conceptual_model\">model<\/span> is written for <span id=\"urn:enhancement-fd3e221c\" class=\"textannotation disambiguated wl-thing\" itemid=\"http:\/\/data.wordlift.io\/wl01770\/entity\/tensorflow\">TensorFlow<\/span>. Using the <a href=\"https:\/\/www.tensorflow.org\/lite\/convert\">TensorFlow Lite Converter<\/a>, convert it to TensorFlow Lite FlatBuffer format&nbsp; &#8211;&nbsp; an efficient flat <span id=\"urn:enhancement-de04d1b1\" class=\"textannotation disambiguated wl-thing\" itemid=\"http:\/\/data.wordlift.io\/wl01770\/entity\/structure\">structure<\/span> for storing hierarchical <span id=\"urn:enhancement-48f076f2\" class=\"textannotation disambiguated wl-thing\" itemid=\"http:\/\/data.wordlift.io\/wl01770\/entity\/data\">data<\/span>.&nbsp;<\/p>\n\n\n\n<p>In brief, the steps needed are:<\/p>\n\n\n\n<ol class=\"wp-block-list\"><li>Using the Python API, convert the model to TensorFlow Lite FlatBuffer. You need to import TensorFlow and use the tf.lite.TFLiteConverter.from_saved_model() command.&nbsp;<\/li><li>Convert the FlatBuffer to a C byte array. 
The easiest way to do this manually is with the Linux <code>xxd<\/code> command.&nbsp;<\/li><li>Integrate the C++ library. You need to include the <a href=\"https:\/\/www.tensorflow.org\/lite\/microcontrollers\/library\">TensorFlow Lite C++ library<\/a> in your build.&nbsp;<\/li><li>Deploy the model. You can program the model into the evaluation board\u2019s flash memory. Alternatively, you could load it onto an SD card.&nbsp;<\/li><\/ol>\n\n\n\n<h3 class=\"wp-block-heading\">Building the model<\/h3>\n\n\n\n<p>When building your <span id=\"urn:enhancement-4552e59f\" class=\"textannotation disambiguated wl-thing\" itemid=\"http:\/\/data.wordlift.io\/wl01770\/entity\/conceptual_model\">model<\/span>, it is advantageous to set your development board as the build target, as this ensures:<\/p>\n\n\n\n<ul class=\"wp-block-list\"><li>The IDE handles the process of converting the TensorFlow model to TensorFlow Lite and packaging it as a FlatBuffer.<\/li><li>The IDE calls the correct SDKs for the board. This automatically includes the <a href=\"https:\/\/www.nxp.com\/design\/software\/development-software\/eiq-ml-development-environment:EIQ\">eIQ SDK<\/a>, together with a large selection of machine learning resources and libraries, and ensures the TensorFlow Lite library is loaded.<\/li><li>The IDE can communicate with the board\u2019s debug port to read debug information.<\/li><\/ul>\n\n\n\n<p>Within MCUXpresso, click on the build icon to initiate all steps needed to convert the <span id=\"urn:enhancement-f2f360df\" class=\"textannotation disambiguated wl-thing\" itemid=\"http:\/\/data.wordlift.io\/wl01770\/entity\/conceptual_model\">model<\/span> and load it onto the development board. 
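Steps 1 and 2 of the conversion described above can be sketched in Python. This is a minimal sketch, not code from the original project: the saved-model path and the C array name are illustrative assumptions, and the helper reproduces what `xxd -i` would emit.

```python
# Sketch of conversion steps 1-2: SavedModel -> TensorFlow Lite FlatBuffer
# -> C byte array. Path and array name below are illustrative assumptions.

def tflite_to_c_array(data: bytes, name: str = "g_magic_wand_model_data") -> str:
    """Render FlatBuffer bytes as a C source snippet (what `xxd -i` produces)."""
    hex_bytes = ", ".join(f"0x{b:02x}" for b in data)
    return (
        f"const unsigned char {name}[] = {{{hex_bytes}}};\n"
        f"const unsigned int {name}_len = {len(data)};\n"
    )

def convert_saved_model(saved_model_dir: str) -> bytes:
    """Step 1: convert a TensorFlow SavedModel to a TFLite FlatBuffer."""
    import tensorflow as tf  # deferred: only needed for the actual conversion
    converter = tf.lite.TFLiteConverter.from_saved_model(saved_model_dir)
    return converter.convert()

# Typical usage (requires TensorFlow and a trained model):
#   tflite_model = convert_saved_model("magic_wand_saved_model")
#   with open("magic_wand_model_data.cc", "w") as f:
#       f.write(tflite_to_c_array(tflite_model))
```

The resulting `.cc` file can then be compiled into the firmware alongside the TensorFlow Lite C++ library, as in step 3.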
As with all Eclipse-based IDEs, the build icon is a small hammer&nbsp; <img decoding=\"async\" src=\"https:\/\/lh4.googleusercontent.com\/t2yqUVkT5asSjb3lp9GYAaehYgCC4TawiC-2EUU3OVE1M05QDRp57Y5uzdDMruNCe0-ddLYt2SOpynn3j1XSJRFQctFJLTcenMHiyJJTVa7cMDisyOptdN-EHFJ8EwTAhEp8woY\" width=\"32\" height=\"35\">.&nbsp;<\/p>\n\n\n\n<p>At this stage, you may encounter some problems if you didn\u2019t include the correct SDKs in your download or forgot to install them in the IDE. If that is the case, go back to the SDK download page and make sure you select all the required elements. Download the SDK package again. Now, in MCUXpresso, go to Window &gt; Show Views &gt; Installed SDKs, which opens a new view in the IDE. To install the SDK, drag and drop the zip file into this view.<\/p>\n\n\n\n<div class=\"wp-block-image\"><figure class=\"aligncenter size-large\"><img loading=\"lazy\" decoding=\"async\" width=\"1024\" height=\"383\" src=\"https:\/\/www.codemotion.com\/magazine\/wp-content\/uploads\/2020\/09\/image6-1024x383.png\" alt=\"\" class=\"wp-image-8582\" srcset=\"https:\/\/www.codemotion.com\/magazine\/wp-content\/uploads\/2020\/09\/image6-1024x383.png 1024w, https:\/\/www.codemotion.com\/magazine\/wp-content\/uploads\/2020\/09\/image6-300x112.png 300w, https:\/\/www.codemotion.com\/magazine\/wp-content\/uploads\/2020\/09\/image6-768x288.png 768w, https:\/\/www.codemotion.com\/magazine\/wp-content\/uploads\/2020\/09\/image6-1536x575.png 1536w, https:\/\/www.codemotion.com\/magazine\/wp-content\/uploads\/2020\/09\/image6.png 1944w\" sizes=\"auto, (max-width: 1024px) 100vw, 1024px\" \/><\/figure><\/div>\n\n\n\n<h2 class=\"wp-block-heading\">Running the model<\/h2>\n\n\n\n<p>The Magic Wand model should now be loaded onto your evaluation board. As soon as the board is powered up, the model should start running and be ready to test. 
This requires you to connect to the board\u2019s serial port using <span id=\"urn:local-annotation-69486\" class=\"textannotation disambiguated\" itemid=\"http:\/\/data.wordlift.io\/wl01770\/entity\/putty\">PuTTY<\/span> or a similar terminal emulator.&nbsp;<\/p>\n\n\n\n<h3 class=\"wp-block-heading\">Connecting to the serial port<\/h3>\n\n\n\n<p>Plug the board into your computer using a USB cable connected to the <span id=\"urn:local-annotation-449200\" class=\"textannotation disambiguated\" itemid=\"http:\/\/data.wordlift.io\/wl01770\/entity\/debugging\">debug<\/span> port. If you used MCUXpresso to build and deploy the model, the board should already be connected. Now use Device Manager to identify which <span id=\"urn:local-annotation-878391\" class=\"textannotation disambiguated\" itemid=\"http:\/\/data.wordlift.io\/wl01770\/entity\/usb\">USB<\/span> port it is connected to and set up your terminal emulator with the following settings:<\/p>\n\n\n\n<ul class=\"wp-block-list\"><li><em>Port<\/em>: Find the correct port for the USB cable connected to the board<\/li><li><em>Baud Rate<\/em>: 115200 bits per second<\/li><li><em>Data Bits<\/em>: 8<\/li><li><em>Stop Bits<\/em>: 1<\/li><li><em>Parity<\/em>: None<\/li><li><em>Flow Control<\/em>: None<\/li><\/ul>\n\n\n\n<p>If all is well, you should see the words <code>Magic starts!<\/code> appear on the screen.&nbsp;<\/p>\n\n\n\n<h3 class=\"wp-block-heading\">Testing gesture recognition<\/h3>\n\n\n\n<p>Pick up the board and draw a letter W in the air. 
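The serial settings listed above can also be applied in a script rather than a terminal emulator. A minimal sketch using the third-party pyserial package follows; the port name `COM3` is a placeholder assumption, so substitute whatever Device Manager (or `/dev/tty*` on Linux/macOS) reports for your board.

```python
# The serial settings from the list above, expressed for pyserial as a
# scripted alternative to PuTTY. The port name "COM3" is a placeholder.

CONSOLE_SETTINGS = {
    "baudrate": 115200,  # bits per second
    "bytesize": 8,       # data bits
    "stopbits": 1,
    "parity": "N",       # none; flow control is also off by default in pyserial
}

def read_console(port: str = "COM3") -> None:
    """Open the board's debug serial port and echo everything it prints."""
    import serial  # deferred: pip install pyserial
    with serial.Serial(port, **CONSOLE_SETTINGS, timeout=1) as console:
        while True:
            line = console.readline().decode(errors="replace")
            if line:
                print(line, end="")  # "Magic starts!" should appear after reset
```

Running `read_console()` and power-cycling the board should show the same `Magic starts!` banner and gesture output described here.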
You should see the following appearing on the terminal emulator:<\/p>\n\n\n\n<div class=\"wp-block-image\"><figure class=\"aligncenter size-large\"><img loading=\"lazy\" decoding=\"async\" width=\"752\" height=\"160\" src=\"https:\/\/www.codemotion.com\/magazine\/wp-content\/uploads\/2020\/09\/wing.png\" alt=\"\" class=\"wp-image-8583\" srcset=\"https:\/\/www.codemotion.com\/magazine\/wp-content\/uploads\/2020\/09\/wing.png 752w, https:\/\/www.codemotion.com\/magazine\/wp-content\/uploads\/2020\/09\/wing-300x64.png 300w\" sizes=\"auto, (max-width: 752px) 100vw, 752px\" \/><\/figure><\/div>\n\n\n\n<p>Now try drawing a circle in the air. This time you should see:<\/p>\n\n\n\n<div class=\"wp-block-image\"><figure class=\"aligncenter size-large\"><img loading=\"lazy\" decoding=\"async\" width=\"752\" height=\"179\" src=\"https:\/\/www.codemotion.com\/magazine\/wp-content\/uploads\/2020\/09\/ring-1.png\" alt=\"\" class=\"wp-image-8585\" srcset=\"https:\/\/www.codemotion.com\/magazine\/wp-content\/uploads\/2020\/09\/ring-1.png 752w, https:\/\/www.codemotion.com\/magazine\/wp-content\/uploads\/2020\/09\/ring-1-300x71.png 300w\" sizes=\"auto, (max-width: 752px) 100vw, 752px\" \/><\/figure><\/div>\n\n\n\n<p>Finally, try drawing an angle in the air:<\/p>\n\n\n\n<div class=\"wp-block-image\"><figure class=\"aligncenter size-large\"><img loading=\"lazy\" decoding=\"async\" width=\"753\" height=\"222\" src=\"https:\/\/www.codemotion.com\/magazine\/wp-content\/uploads\/2020\/09\/slope.png\" alt=\"\" class=\"wp-image-8586\" srcset=\"https:\/\/www.codemotion.com\/magazine\/wp-content\/uploads\/2020\/09\/slope.png 753w, https:\/\/www.codemotion.com\/magazine\/wp-content\/uploads\/2020\/09\/slope-300x88.png 300w\" sizes=\"auto, (max-width: 753px) 100vw, 753px\" \/><\/figure><\/div>\n\n\n\n<h3 class=\"wp-block-heading\">Taking things a step further<\/h3>\n\n\n\n<p>Clearly, it\u2019s impractical to wave a large board about when using the Magic Wand code. 
Fortunately, you can buy much smaller devices capable of running this code. Then, with a little ingenuity and maybe a <span id=\"urn:local-annotation-79287\" class=\"textannotation disambiguated\" itemid=\"http:\/\/data.wordlift.io\/wl01770\/entity\/3d_printer\">3D printer<\/span>, it\u2019s possible to create a real wand. Alternatively, companies such as <a aria-label=\"undefined (opens in a new tab)\" href=\"https:\/\/kano.me\/eu\/store\/products\/coding-wand\" target=\"_blank\" rel=\"noreferrer noopener\">Kano<\/a> produce suitable hardware designed to teach children how to code. Since the Kano wand includes an accelerometer and is Bluetooth-enabled, it is easy to use it with the NXP MCU board if you install an <a aria-label=\"undefined (opens in a new tab)\" href=\"https:\/\/wiki.seeedstudio.com\/Bluetooth_Shield\/\" target=\"_blank\" rel=\"noreferrer noopener\"><span id=\"urn:local-annotation-26582\" class=\"textannotation disambiguated\" itemid=\"http:\/\/data.wordlift.io\/wl01770\/entity\/arduino\">Arduino<\/span> <span id=\"urn:local-annotation-802084\" class=\"textannotation disambiguated\" itemid=\"http:\/\/data.wordlift.io\/wl01770\/entity\/bluetooth\">Bluetooth<\/span> shield<\/a>.&nbsp;<\/p>\n\n\n\n<h2 class=\"wp-block-heading\">Conclusions and what next<\/h2>\n\n\n\n<p><span id=\"urn:local-annotation-377047\" class=\"textannotation disambiguated\" itemid=\"http:\/\/data.wordlift.io\/wl01770\/entity\/edge_computing\">Edge<\/span> ML is an essential tool for enabling gesture recognition. Modern MCUs are powerful enough to run complex ML <span id=\"urn:local-annotation-98320\" class=\"textannotation disambiguated\" itemid=\"http:\/\/data.wordlift.io\/wl01770\/entity\/list_of_machine_learning_concepts\">algorithms<\/span> in real time, enabling gesture recognition systems based on accelerometers, time-of-flight detectors, and even computer vision systems. 
The next article will show how edge ML can be used for person detection and explain how this can be extended to full facial recognition.<\/p>\n\n\n","protected":false},"excerpt":{"rendered":"<p>Machine learning is the primary methodology for delivering AI applications. In previous articles, I discussed the main reasons behind moving machine learning to the network edge. These include the need for real-time performance, security considerations, and a lack of connectivity. However, ML models typically require powerful hardware to run efficiently. Fortunately, in recent years, MCUs&#8230; <a class=\"more-link\" href=\"https:\/\/www.codemotion.com\/magazine\/ai-ml\/machine-learning\/machine-learning-edge-example\/\">Read more<\/a><\/p>\n","protected":false},"author":83,"featured_media":8694,"comment_status":"closed","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"_editorskit_title_hidden":false,"_editorskit_reading_time":8,"_editorskit_is_block_options_detached":false,"_editorskit_block_options_position":"{}","_uag_custom_page_level_css":"","_genesis_hide_title":false,"_genesis_hide_breadcrumbs":false,"_genesis_hide_singular_image":false,"_genesis_hide_footer_widgets":false,"_genesis_custom_body_class":"","_genesis_custom_post_class":"","_genesis_layout":"","footnotes":""},"categories":[35],"tags":[3357,7105],"collections":[],"class_list":{"0":"post-8576","1":"post","2":"type-post","3":"status-publish","4":"format-standard","5":"has-post-thumbnail","7":"category-machine-learning","8":"tag-iot","9":"tag-mouser","10":"entry"},"yoast_head":"<!-- This site is optimized with the Yoast SEO Premium plugin v26.9 (Yoast SEO v26.9) - https:\/\/yoast.com\/product\/yoast-seo-premium-wordpress\/ -->\n<title>ML at the Edge: a Practical Example - Codemotion Magazine<\/title>\n<meta name=\"description\" content=\"This article is about Machine Learning at the Network Edge presents a practical implementation of ML using an NXP i.MX RT1050 evaluation 
kit.\" \/>\n<meta name=\"robots\" content=\"index, follow, max-snippet:-1, max-image-preview:large, max-video-preview:-1\" \/>\n<link rel=\"canonical\" href=\"https:\/\/www.codemotion.com\/magazine\/ai-ml\/machine-learning\/machine-learning-edge-example\/\" \/>\n<meta property=\"og:locale\" content=\"en_US\" \/>\n<meta property=\"og:type\" content=\"article\" \/>\n<meta property=\"og:title\" content=\"ML at the Edge: a Practical Example\" \/>\n<meta property=\"og:description\" content=\"This article is about Machine Learning at the Network Edge presents a practical implementation of ML using an NXP i.MX RT1050 evaluation kit.\" \/>\n<meta property=\"og:url\" content=\"https:\/\/www.codemotion.com\/magazine\/ai-ml\/machine-learning\/machine-learning-edge-example\/\" \/>\n<meta property=\"og:site_name\" content=\"Codemotion Magazine\" \/>\n<meta property=\"article:publisher\" content=\"https:\/\/www.facebook.com\/Codemotion.Italy\/\" \/>\n<meta property=\"article:published_time\" content=\"2020-09-07T09:34:33+00:00\" \/>\n<meta property=\"article:modified_time\" content=\"2022-01-05T19:05:53+00:00\" \/>\n<meta property=\"og:image\" content=\"https:\/\/www.codemotion.com\/magazine\/wp-content\/uploads\/2020\/09\/ML-at-the-edge.jpg\" \/>\n\t<meta property=\"og:image:width\" content=\"1200\" \/>\n\t<meta property=\"og:image:height\" content=\"675\" \/>\n\t<meta property=\"og:image:type\" content=\"image\/jpeg\" \/>\n<meta name=\"author\" content=\"Mark Patrick, Mouser Electronics\" \/>\n<meta name=\"twitter:card\" content=\"summary_large_image\" \/>\n<meta name=\"twitter:creator\" content=\"@CodemotionIT\" \/>\n<meta name=\"twitter:site\" content=\"@CodemotionIT\" \/>\n<meta name=\"twitter:label1\" content=\"Written by\" \/>\n\t<meta name=\"twitter:data1\" content=\"Mark Patrick, Mouser Electronics\" \/>\n\t<meta name=\"twitter:label2\" content=\"Est. 
reading time\" \/>\n\t<meta name=\"twitter:data2\" content=\"9 minutes\" \/>\n<script type=\"application\/ld+json\" class=\"yoast-schema-graph\">{\"@context\":\"https:\/\/schema.org\",\"@graph\":[{\"@type\":\"Article\",\"@id\":\"https:\/\/www.codemotion.com\/magazine\/ai-ml\/machine-learning\/machine-learning-edge-example\/#article\",\"isPartOf\":{\"@id\":\"https:\/\/www.codemotion.com\/magazine\/ai-ml\/machine-learning\/machine-learning-edge-example\/\"},\"author\":{\"name\":\"Mark Patrick, Mouser Electronics\",\"@id\":\"https:\/\/www.codemotion.com\/magazine\/#\/schema\/person\/664e4da6990fc1344a2299435a542654\"},\"headline\":\"ML at the Edge: a Practical Example\",\"datePublished\":\"2020-09-07T09:34:33+00:00\",\"dateModified\":\"2022-01-05T19:05:53+00:00\",\"mainEntityOfPage\":{\"@id\":\"https:\/\/www.codemotion.com\/magazine\/ai-ml\/machine-learning\/machine-learning-edge-example\/\"},\"wordCount\":1923,\"publisher\":{\"@id\":\"https:\/\/www.codemotion.com\/magazine\/#organization\"},\"image\":{\"@id\":\"https:\/\/www.codemotion.com\/magazine\/ai-ml\/machine-learning\/machine-learning-edge-example\/#primaryimage\"},\"thumbnailUrl\":\"https:\/\/www.codemotion.com\/magazine\/wp-content\/uploads\/2020\/09\/ML-at-the-edge.jpg\",\"keywords\":[\"IoT\",\"Mouser\"],\"articleSection\":[\"Machine Learning\"],\"inLanguage\":\"en-US\"},{\"@type\":\"WebPage\",\"@id\":\"https:\/\/www.codemotion.com\/magazine\/ai-ml\/machine-learning\/machine-learning-edge-example\/\",\"url\":\"https:\/\/www.codemotion.com\/magazine\/ai-ml\/machine-learning\/machine-learning-edge-example\/\",\"name\":\"ML at the Edge: a Practical Example - Codemotion 
Magazine\",\"isPartOf\":{\"@id\":\"https:\/\/www.codemotion.com\/magazine\/#website\"},\"primaryImageOfPage\":{\"@id\":\"https:\/\/www.codemotion.com\/magazine\/ai-ml\/machine-learning\/machine-learning-edge-example\/#primaryimage\"},\"image\":{\"@id\":\"https:\/\/www.codemotion.com\/magazine\/ai-ml\/machine-learning\/machine-learning-edge-example\/#primaryimage\"},\"thumbnailUrl\":\"https:\/\/www.codemotion.com\/magazine\/wp-content\/uploads\/2020\/09\/ML-at-the-edge.jpg\",\"datePublished\":\"2020-09-07T09:34:33+00:00\",\"dateModified\":\"2022-01-05T19:05:53+00:00\",\"description\":\"This article is about Machine Learning at the Network Edge presents a practical implementation of ML using an NXP i.MX RT1050 evaluation kit.\",\"breadcrumb\":{\"@id\":\"https:\/\/www.codemotion.com\/magazine\/ai-ml\/machine-learning\/machine-learning-edge-example\/#breadcrumb\"},\"inLanguage\":\"en-US\",\"potentialAction\":[{\"@type\":\"ReadAction\",\"target\":[\"https:\/\/www.codemotion.com\/magazine\/ai-ml\/machine-learning\/machine-learning-edge-example\/\"]}]},{\"@type\":\"ImageObject\",\"inLanguage\":\"en-US\",\"@id\":\"https:\/\/www.codemotion.com\/magazine\/ai-ml\/machine-learning\/machine-learning-edge-example\/#primaryimage\",\"url\":\"https:\/\/www.codemotion.com\/magazine\/wp-content\/uploads\/2020\/09\/ML-at-the-edge.jpg\",\"contentUrl\":\"https:\/\/www.codemotion.com\/magazine\/wp-content\/uploads\/2020\/09\/ML-at-the-edge.jpg\",\"width\":1200,\"height\":675,\"caption\":\"ML at the edge\"},{\"@type\":\"BreadcrumbList\",\"@id\":\"https:\/\/www.codemotion.com\/magazine\/ai-ml\/machine-learning\/machine-learning-edge-example\/#breadcrumb\",\"itemListElement\":[{\"@type\":\"ListItem\",\"position\":1,\"name\":\"Home\",\"item\":\"https:\/\/www.codemotion.com\/magazine\/\"},{\"@type\":\"ListItem\",\"position\":2,\"name\":\"AI\/ML\",\"item\":\"https:\/\/www.codemotion.com\/magazine\/ai-ml\/\"},{\"@type\":\"ListItem\",\"position\":3,\"name\":\"Machine 
Learning\",\"item\":\"https:\/\/www.codemotion.com\/magazine\/ai-ml\/machine-learning\/\"},{\"@type\":\"ListItem\",\"position\":4,\"name\":\"ML at the Edge: a Practical Example\"}]},{\"@type\":\"WebSite\",\"@id\":\"https:\/\/www.codemotion.com\/magazine\/#website\",\"url\":\"https:\/\/www.codemotion.com\/magazine\/\",\"name\":\"Codemotion Magazine\",\"description\":\"We code the future. Together\",\"publisher\":{\"@id\":\"https:\/\/www.codemotion.com\/magazine\/#organization\"},\"potentialAction\":[{\"@type\":\"SearchAction\",\"target\":{\"@type\":\"EntryPoint\",\"urlTemplate\":\"https:\/\/www.codemotion.com\/magazine\/?s={search_term_string}\"},\"query-input\":{\"@type\":\"PropertyValueSpecification\",\"valueRequired\":true,\"valueName\":\"search_term_string\"}}],\"inLanguage\":\"en-US\"},{\"@type\":\"Organization\",\"@id\":\"https:\/\/www.codemotion.com\/magazine\/#organization\",\"name\":\"Codemotion\",\"url\":\"https:\/\/www.codemotion.com\/magazine\/\",\"logo\":{\"@type\":\"ImageObject\",\"inLanguage\":\"en-US\",\"@id\":\"https:\/\/www.codemotion.com\/magazine\/#\/schema\/logo\/image\/\",\"url\":\"https:\/\/www.codemotion.com\/magazine\/wp-content\/uploads\/2019\/11\/codemotionlogo.png\",\"contentUrl\":\"https:\/\/www.codemotion.com\/magazine\/wp-content\/uploads\/2019\/11\/codemotionlogo.png\",\"width\":225,\"height\":225,\"caption\":\"Codemotion\"},\"image\":{\"@id\":\"https:\/\/www.codemotion.com\/magazine\/#\/schema\/logo\/image\/\"},\"sameAs\":[\"https:\/\/www.facebook.com\/Codemotion.Italy\/\",\"https:\/\/x.com\/CodemotionIT\"]},{\"@type\":\"Person\",\"@id\":\"https:\/\/www.codemotion.com\/magazine\/#\/schema\/person\/664e4da6990fc1344a2299435a542654\",\"name\":\"Mark Patrick, Mouser 
Electronics\",\"image\":{\"@type\":\"ImageObject\",\"inLanguage\":\"en-US\",\"@id\":\"https:\/\/www.codemotion.com\/magazine\/#\/schema\/person\/image\/\",\"url\":\"https:\/\/secure.gravatar.com\/avatar\/0d35fad9fee01e991637b67f54ae7cb8b001b5d2c1e4f7c1942b2105dad5a9bf?s=96&d=mm&r=g\",\"contentUrl\":\"https:\/\/secure.gravatar.com\/avatar\/0d35fad9fee01e991637b67f54ae7cb8b001b5d2c1e4f7c1942b2105dad5a9bf?s=96&d=mm&r=g\",\"caption\":\"Mark Patrick, Mouser Electronics\"},\"url\":\"https:\/\/www.codemotion.com\/magazine\/author\/mark-patrick\/\"}]}<\/script>\n<!-- \/ Yoast SEO Premium plugin. -->","yoast_head_json":{"title":"ML at the Edge: a Practical Example - Codemotion Magazine","description":"This article is about Machine Learning at the Network Edge presents a practical implementation of ML using an NXP i.MX RT1050 evaluation kit.","robots":{"index":"index","follow":"follow","max-snippet":"max-snippet:-1","max-image-preview":"max-image-preview:large","max-video-preview":"max-video-preview:-1"},"canonical":"https:\/\/www.codemotion.com\/magazine\/ai-ml\/machine-learning\/machine-learning-edge-example\/","og_locale":"en_US","og_type":"article","og_title":"ML at the Edge: a Practical Example","og_description":"This article is about Machine Learning at the Network Edge presents a practical implementation of ML using an NXP i.MX RT1050 evaluation kit.","og_url":"https:\/\/www.codemotion.com\/magazine\/ai-ml\/machine-learning\/machine-learning-edge-example\/","og_site_name":"Codemotion Magazine","article_publisher":"https:\/\/www.facebook.com\/Codemotion.Italy\/","article_published_time":"2020-09-07T09:34:33+00:00","article_modified_time":"2022-01-05T19:05:53+00:00","og_image":[{"width":1200,"height":675,"url":"https:\/\/www.codemotion.com\/magazine\/wp-content\/uploads\/2020\/09\/ML-at-the-edge.jpg","type":"image\/jpeg"}],"author":"Mark Patrick, Mouser 
Electronics","twitter_card":"summary_large_image","twitter_creator":"@CodemotionIT","twitter_site":"@CodemotionIT","twitter_misc":{"Written by":"Mark Patrick, Mouser Electronics","Est. reading time":"9 minutes"},"schema":{"@context":"https:\/\/schema.org","@graph":[{"@type":"Article","@id":"https:\/\/www.codemotion.com\/magazine\/ai-ml\/machine-learning\/machine-learning-edge-example\/#article","isPartOf":{"@id":"https:\/\/www.codemotion.com\/magazine\/ai-ml\/machine-learning\/machine-learning-edge-example\/"},"author":{"name":"Mark Patrick, Mouser Electronics","@id":"https:\/\/www.codemotion.com\/magazine\/#\/schema\/person\/664e4da6990fc1344a2299435a542654"},"headline":"ML at the Edge: a Practical Example","datePublished":"2020-09-07T09:34:33+00:00","dateModified":"2022-01-05T19:05:53+00:00","mainEntityOfPage":{"@id":"https:\/\/www.codemotion.com\/magazine\/ai-ml\/machine-learning\/machine-learning-edge-example\/"},"wordCount":1923,"publisher":{"@id":"https:\/\/www.codemotion.com\/magazine\/#organization"},"image":{"@id":"https:\/\/www.codemotion.com\/magazine\/ai-ml\/machine-learning\/machine-learning-edge-example\/#primaryimage"},"thumbnailUrl":"https:\/\/www.codemotion.com\/magazine\/wp-content\/uploads\/2020\/09\/ML-at-the-edge.jpg","keywords":["IoT","Mouser"],"articleSection":["Machine Learning"],"inLanguage":"en-US"},{"@type":"WebPage","@id":"https:\/\/www.codemotion.com\/magazine\/ai-ml\/machine-learning\/machine-learning-edge-example\/","url":"https:\/\/www.codemotion.com\/magazine\/ai-ml\/machine-learning\/machine-learning-edge-example\/","name":"ML at the Edge: a Practical Example - Codemotion 
Magazine","isPartOf":{"@id":"https:\/\/www.codemotion.com\/magazine\/#website"},"primaryImageOfPage":{"@id":"https:\/\/www.codemotion.com\/magazine\/ai-ml\/machine-learning\/machine-learning-edge-example\/#primaryimage"},"image":{"@id":"https:\/\/www.codemotion.com\/magazine\/ai-ml\/machine-learning\/machine-learning-edge-example\/#primaryimage"},"thumbnailUrl":"https:\/\/www.codemotion.com\/magazine\/wp-content\/uploads\/2020\/09\/ML-at-the-edge.jpg","datePublished":"2020-09-07T09:34:33+00:00","dateModified":"2022-01-05T19:05:53+00:00","description":"This article is about Machine Learning at the Network Edge presents a practical implementation of ML using an NXP i.MX RT1050 evaluation kit.","breadcrumb":{"@id":"https:\/\/www.codemotion.com\/magazine\/ai-ml\/machine-learning\/machine-learning-edge-example\/#breadcrumb"},"inLanguage":"en-US","potentialAction":[{"@type":"ReadAction","target":["https:\/\/www.codemotion.com\/magazine\/ai-ml\/machine-learning\/machine-learning-edge-example\/"]}]},{"@type":"ImageObject","inLanguage":"en-US","@id":"https:\/\/www.codemotion.com\/magazine\/ai-ml\/machine-learning\/machine-learning-edge-example\/#primaryimage","url":"https:\/\/www.codemotion.com\/magazine\/wp-content\/uploads\/2020\/09\/ML-at-the-edge.jpg","contentUrl":"https:\/\/www.codemotion.com\/magazine\/wp-content\/uploads\/2020\/09\/ML-at-the-edge.jpg","width":1200,"height":675,"caption":"ML at the edge"},{"@type":"BreadcrumbList","@id":"https:\/\/www.codemotion.com\/magazine\/ai-ml\/machine-learning\/machine-learning-edge-example\/#breadcrumb","itemListElement":[{"@type":"ListItem","position":1,"name":"Home","item":"https:\/\/www.codemotion.com\/magazine\/"},{"@type":"ListItem","position":2,"name":"AI\/ML","item":"https:\/\/www.codemotion.com\/magazine\/ai-ml\/"},{"@type":"ListItem","position":3,"name":"Machine Learning","item":"https:\/\/www.codemotion.com\/magazine\/ai-ml\/machine-learning\/"},{"@type":"ListItem","position":4,"name":"ML at the Edge: a Practical 
Example"}]},{"@type":"WebSite","@id":"https:\/\/www.codemotion.com\/magazine\/#website","url":"https:\/\/www.codemotion.com\/magazine\/","name":"Codemotion Magazine","description":"We code the future. Together","publisher":{"@id":"https:\/\/www.codemotion.com\/magazine\/#organization"},"potentialAction":[{"@type":"SearchAction","target":{"@type":"EntryPoint","urlTemplate":"https:\/\/www.codemotion.com\/magazine\/?s={search_term_string}"},"query-input":{"@type":"PropertyValueSpecification","valueRequired":true,"valueName":"search_term_string"}}],"inLanguage":"en-US"},{"@type":"Organization","@id":"https:\/\/www.codemotion.com\/magazine\/#organization","name":"Codemotion","url":"https:\/\/www.codemotion.com\/magazine\/","logo":{"@type":"ImageObject","inLanguage":"en-US","@id":"https:\/\/www.codemotion.com\/magazine\/#\/schema\/logo\/image\/","url":"https:\/\/www.codemotion.com\/magazine\/wp-content\/uploads\/2019\/11\/codemotionlogo.png","contentUrl":"https:\/\/www.codemotion.com\/magazine\/wp-content\/uploads\/2019\/11\/codemotionlogo.png","width":225,"height":225,"caption":"Codemotion"},"image":{"@id":"https:\/\/www.codemotion.com\/magazine\/#\/schema\/logo\/image\/"},"sameAs":["https:\/\/www.facebook.com\/Codemotion.Italy\/","https:\/\/x.com\/CodemotionIT"]},{"@type":"Person","@id":"https:\/\/www.codemotion.com\/magazine\/#\/schema\/person\/664e4da6990fc1344a2299435a542654","name":"Mark Patrick, Mouser Electronics","image":{"@type":"ImageObject","inLanguage":"en-US","@id":"https:\/\/www.codemotion.com\/magazine\/#\/schema\/person\/image\/","url":"https:\/\/secure.gravatar.com\/avatar\/0d35fad9fee01e991637b67f54ae7cb8b001b5d2c1e4f7c1942b2105dad5a9bf?s=96&d=mm&r=g","contentUrl":"https:\/\/secure.gravatar.com\/avatar\/0d35fad9fee01e991637b67f54ae7cb8b001b5d2c1e4f7c1942b2105dad5a9bf?s=96&d=mm&r=g","caption":"Mark Patrick, Mouser 
Electronics"},"url":"https:\/\/www.codemotion.com\/magazine\/author\/mark-patrick\/"}]}},"featured_image_src":"https:\/\/www.codemotion.com\/magazine\/wp-content\/uploads\/2020\/09\/ML-at-the-edge-600x400.jpg","featured_image_src_square":"https:\/\/www.codemotion.com\/magazine\/wp-content\/uploads\/2020\/09\/ML-at-the-edge-600x600.jpg","author_info":{"display_name":"Mark Patrick, Mouser Electronics","author_link":"https:\/\/www.codemotion.com\/magazine\/author\/mark-patrick\/"},"uagb_featured_image_src":{"full":["https:\/\/www.codemotion.com\/magazine\/wp-content\/uploads\/2020\/09\/ML-at-the-edge.jpg",1200,675,false],"thumbnail":["https:\/\/www.codemotion.com\/magazine\/wp-content\/uploads\/2020\/09\/ML-at-the-edge-150x150.jpg",150,150,true],"medium":["https:\/\/www.codemotion.com\/magazine\/wp-content\/uploads\/2020\/09\/ML-at-the-edge-300x169.jpg",300,169,true],"medium_large":["https:\/\/www.codemotion.com\/magazine\/wp-content\/uploads\/2020\/09\/ML-at-the-edge-768x432.jpg",768,432,true],"large":["https:\/\/www.codemotion.com\/magazine\/wp-content\/uploads\/2020\/09\/ML-at-the-edge-1024x576.jpg",1024,576,true],"1536x1536":["https:\/\/www.codemotion.com\/magazine\/wp-content\/uploads\/2020\/09\/ML-at-the-edge.jpg",1200,675,false],"2048x2048":["https:\/\/www.codemotion.com\/magazine\/wp-content\/uploads\/2020\/09\/ML-at-the-edge.jpg",1200,675,false],"small-home-featured":["https:\/\/www.codemotion.com\/magazine\/wp-content\/uploads\/2020\/09\/ML-at-the-edge.jpg",100,56,false],"sidebar-featured":["https:\/\/www.codemotion.com\/magazine\/wp-content\/uploads\/2020\/09\/ML-at-the-edge-180x128.jpg",180,128,true],"genesis-singular-images":["https:\/\/www.codemotion.com\/magazine\/wp-content\/uploads\/2020\/09\/ML-at-the-edge-896x504.jpg",896,504,true],"archive-featured":["https:\/\/www.codemotion.com\/magazine\/wp-content\/uploads\/2020\/09\/ML-at-the-edge-400x225.jpg",400,225,true],"gb-block-post-grid-landscape":["https:\/\/www.codemotion.com\/magazine\/wp-content\/upl
oads\/2020\/09\/ML-at-the-edge-600x400.jpg",600,400,true],"gb-block-post-grid-square":["https:\/\/www.codemotion.com\/magazine\/wp-content\/uploads\/2020\/09\/ML-at-the-edge-600x600.jpg",600,600,true]},"uagb_author_info":{"display_name":"Mark Patrick, Mouser Electronics","author_link":"https:\/\/www.codemotion.com\/magazine\/author\/mark-patrick\/"},"uagb_comment_info":0,"uagb_excerpt":"Machine learning is the primary methodology for delivering AI applications. In previous articles, I discussed the main reasons behind moving machine learning to the network edge. These include the need for real-time performance, security considerations, and a lack of connectivity. However, ML models typically require powerful hardware to run efficiently. Fortunately, in recent years, MCUs&#8230;&hellip;","lang":"en","_links":{"self":[{"href":"https:\/\/www.codemotion.com\/magazine\/wp-json\/wp\/v2\/posts\/8576","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.codemotion.com\/magazine\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.codemotion.com\/magazine\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.codemotion.com\/magazine\/wp-json\/wp\/v2\/users\/83"}],"replies":[{"embeddable":true,"href":"https:\/\/www.codemotion.com\/magazine\/wp-json\/wp\/v2\/comments?post=8576"}],"version-history":[{"count":8,"href":"https:\/\/www.codemotion.com\/magazine\/wp-json\/wp\/v2\/posts\/8576\/revisions"}],"predecessor-version":[{"id":14244,"href":"https:\/\/www.codemotion.com\/magazine\/wp-json\/wp\/v2\/posts\/8576\/revisions\/14244"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/www.codemotion.com\/magazine\/wp-json\/wp\/v2\/media\/8694"}],"wp:attachment":[{"href":"https:\/\/www.codemotion.com\/magazine\/wp-json\/wp\/v2\/media?parent=8576"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.codemotion.com\/magazine\/wp-json\/wp\/v2\/categories?post=8576"},{"taxonomy":"post_tag","embeddable":tru
e,"href":"https:\/\/www.codemotion.com\/magazine\/wp-json\/wp\/v2\/tags?post=8576"},{"taxonomy":"collections","embeddable":true,"href":"https:\/\/www.codemotion.com\/magazine\/wp-json\/wp\/v2\/collections?post=8576"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}