Mathematics of Activation Function
avinash · Visual Cognition Laboratory · May 14, 2023

Activation functions play a crucial role in artificial neural networks (ANNs) by introducing nonlinearity into the model, enabling it to learn complex patterns and solve a wide range of problems. Mathematically, an activation function transforms a neuron's input signal (the weighted sum of its inputs) into an output signal. Here, we discuss a few common activation functions and their formulations:

1. Sigmoid (logistic) function: The sigmoid function maps input values to the range (0, 1), making it suitable for binary classification outputs that can be interpreted as probabilities. It is defined as:

σ(x) = 1 / (1 + exp(-x))

2. Hyperbolic tangent (tanh) function: The tanh function is similar to the sigmoid but maps input values to the range (-1, 1), giving zero-centered outputs. It is defined as:

tanh(x) = (exp(x) - exp(-x)) / (exp(x) + exp(-x))

3. Rectified Linear Unit (ReLU) function: ReLU is a popular activation function due to its simplicity and computational efficiency. It sets negative input values to 0 and leaves positive values unchanged:

ReLU(x) = max(0, x)

4. Leaky Rectified Linear Unit (Leaky ReLU) function: Leaky ReLU is a modification of ReLU that allows a small negative slope when the input is negative, which helps mitigate the "dying ReLU" problem (neurons becoming permanently inactive during training). It is defined as:

Leaky_ReLU(x) = max(αx, x), where α is a small constant (e.g., 0.01)

5. Exponential Linear Unit (ELU) function: The ELU function is similar to ReLU but has a smooth exponential curve for negative input values, which can help alleviate the vanishing gradient problem. It is defined as:

ELU(x) = x, if x > 0
ELU(x) = α(exp(x) - 1), if x ≤ 0, where α is a scaling constant (e.g., 1)

These are just a few examples of activation functions used in neural networks. The choice of activation function depends on the specific problem and the desired properties of the network, such as smoothness, computational efficiency, and output range.
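The five definitions above can be sketched directly in plain Python. This is a minimal illustrative implementation (the function names and the sample inputs are our own, not from any particular library):

```python
import math

def sigmoid(x):
    # sigma(x) = 1 / (1 + exp(-x)); maps any real x into (0, 1)
    return 1.0 / (1.0 + math.exp(-x))

def tanh(x):
    # (exp(x) - exp(-x)) / (exp(x) + exp(-x)); maps x into (-1, 1)
    return math.tanh(x)

def relu(x):
    # max(0, x): negative inputs become 0, positive inputs pass through
    return max(0.0, x)

def leaky_relu(x, alpha=0.01):
    # max(alpha * x, x): small slope alpha for x < 0 avoids dead neurons
    return max(alpha * x, x)

def elu(x, alpha=1.0):
    # x for x > 0; smooth exponential alpha * (exp(x) - 1) for x <= 0
    return x if x > 0 else alpha * (math.exp(x) - 1.0)

# Evaluate each function on a negative, zero, and positive input
for f in (sigmoid, tanh, relu, leaky_relu, elu):
    print(f.__name__, [round(f(v), 4) for v in (-2.0, 0.0, 2.0)])
```

Note how the output ranges differ: sigmoid stays in (0, 1), tanh in (-1, 1), while ReLU and its variants are unbounded above, which is part of why they avoid saturating gradients for large positive inputs.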