<![CDATA[IMDEEPAK]]>https://imdeepak.com
<![CDATA[The Notion of a Good Algorithm]]>https://imdeepak.com/the-notion-of-a-good-algorithmFri, 18 Mar 2022 19:10:48 GMT<![CDATA[<p><strong>What is an Algorithm?</strong></p>
<p>An algorithm is a well-defined computational procedure that takes some value, or a set of values, as input and produces some value, or a set of values, as output.</p>
<p><strong>What is a good algorithm?</strong></p>
<p>A good algorithm is measured by its running time and its space utilization. In layman's terms: "<em>the less, the better</em>".</p>
<p><strong>Can we measure the running time and space utilization of an algorithm? If yes, how exactly?</strong></p>
<p>Yes, we can measure the efficiency of an algorithm by determining the running time and memory space it needs to run to completion.</p>
<p>This can be measured in two ways:</p>
<ol><li><strong>Time Complexity</strong>: The time complexity of an algorithm is the amount of time it needs to run to completion.</li><li><strong>Space Complexity</strong>: The space complexity of an algorithm is the amount of space it needs to run to completion.</li></ol>
<p>Along with this, we should be mindful of how an algorithm's efficiency varies with its input:</p>
<ol><li><strong>Worst-case efficiency</strong>: the maximum number of steps that an algorithm has to take for any collection of data values.</li><li><strong>Best-case efficiency</strong>: the minimum number of steps that an algorithm has to take for any collection of data values.</li><li><strong>Average-case efficiency</strong>: the efficiency averaged over all possible inputs.</li></ol>
<p>A graphical representation of the worst, best, and average cases of an algorithm:<img src="https://cdn.hashnode.com/res/hashnode/image/upload/v1647617735287/ZXKxNtQbU.png" alt="image.png" /></p>
<p><strong>Analysis of Algorithms</strong></p>
<p>Let's study the different methods used in the analysis of algorithms.</p>
<p><strong>Asymptotic Analysis (Growth of Functions)</strong></p>
<p>An asymptote is a line that continuously approaches a given curve but never meets it at any finite distance.</p>
<p>For example, in the graph below, x is asymptotic to x + 1.<img src="https://cdn.hashnode.com/res/hashnode/image/upload/v1647624968497/OD6ykUtpM.png" alt="image.png" /></p>
<p>Asymptotics may also be defined as a way to describe the behaviour of functions in the limit, or without bounds.</p>
<p>Let f(x) and g(x) be two functions of real numbers. We say that f and g are asymptotic, and write f(x) ~ g(x), if<img src="https://cdn.hashnode.com/res/hashnode/image/upload/v1647624907189/R4SxXr9sA.png" alt="image.png" /></p>
<p><strong>Asymptotic Notations</strong></p>
<p>In asymptotic notation, we ignore smaller terms and constant factors while figuring out complexity.</p>
<ol><li><strong>Big-O Notation (O)</strong>: It provides the upper bound for f(n) and gives the worst-case complexity, i.e.
the measure of the longest amount of time the algorithm can take. We say that f(n) = O(g(n)) if and only if there are positive constants c and n0 such that<img src="https://cdn.hashnode.com/res/hashnode/image/upload/v1647628917239/9neys5h7C.png" alt="image.png" />If f(n) = O(g(n)), we say that g(n) is an upper bound on f(n).<img src="https://cdn.hashnode.com/res/hashnode/image/upload/v1647628858148/HftHB97GT.png" alt="image.png" />For example: <pre><code>f(n) = 3n + 2 is O(n), since 3n + 2 ≤ 4n for all n ≥ 2
f(n) = 3n + 3 is O(n), since 3n + 3 ≤ 4n for all n ≥ 3</code></pre></li></ol><p><strong>NOTE: k·f(n) is O(f(n))</strong> [i.e., constant coefficients can be dropped]. For example, g(n) = 7n<sup>4</sup> is O(n<sup>4</sup>). That is, if we have a function multiplied by a constant, we can ignore the constant in the big-O.</p><ol start="2"><li><strong>Big-Omega Notation (Ω)</strong></li></ol><p>It provides the lower bound for f(n) and gives the best-case complexity, i.e.
the measure of the shortest amount of time the algorithm can take. We say that f(n) = Ω(g(n)) if and only if there are positive constants c and n0 such that<img src="https://cdn.hashnode.com/res/hashnode/image/upload/v1647629853782/fm7F-ir5F.png" alt="image.png" />If f(n) = Ω(g(n)), we say that g(n) is a lower bound on f(n).<img src="https://cdn.hashnode.com/res/hashnode/image/upload/v1647629721303/Na0tjf-de.png" alt="image.png" /></p>]]>
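The Big-O definition discussed above can be spot-checked empirically. Here is a minimal Python sketch of the idea (the helper `holds_big_o`, the witness constants c = 4 and n0 = 2, and the finite check range are illustrative additions, not from the article; a finite check illustrates the definition but does not prove it):

```python
# Empirical spot-check of the Big-O example f(n) = 3n + 2 is O(n):
# the definition asks for constants c and n0 with f(n) <= c * g(n)
# for every n >= n0. Here we try the witness constants c = 4, n0 = 2
# over a finite range (illustration only, not a proof).

def holds_big_o(f, g, c, n0, n_max=10_000):
    """True if f(n) <= c * g(n) for every n in [n0, n_max]."""
    return all(f(n) <= c * g(n) for n in range(n0, n_max + 1))

f = lambda n: 3 * n + 2   # the function being bounded
g = lambda n: n           # the claimed growth rate

print(holds_big_o(f, g, c=4, n0=2))  # True: 3n + 2 <= 4n once n >= 2
print(holds_big_o(f, g, c=4, n0=1))  # False: at n = 1, f(1) = 5 > 4
```

The second call shows why the definition needs the threshold n0: the bound 3n + 2 ≤ 4n fails at n = 1 but holds for every n ≥ 2, which is exactly what "for all n ≥ n0" allows.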