
Algorithm


An algorithm is a clearly defined set of instructions or rules that solves a specific problem or performs a specific task. With roots tracing back to ancient civilizations, algorithms have evolved over centuries and today play a pivotal role in contemporary computing. Techniques such as divide-and-conquer are used in their design, and their efficiency is assessed with metrics such as big O notation. Algorithms can be expressed in multiple ways, including pseudocode, flowcharts, and programming languages. To be executed, they are translated into a language the computer can process, and execution speed is influenced by the instruction set used. Algorithms can be categorized by their design or implementation paradigm, and their efficiency can greatly affect processing time. In fields such as computer science and artificial intelligence[1], understanding and effectively applying algorithms is vital.
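To make these ideas concrete, here is a minimal sketch (not part of the original glossary) of a divide-and-conquer algorithm, binary search, whose O(log n) running time illustrates the kind of efficiency that big O notation captures:

```python
def binary_search(items, target):
    """Divide-and-conquer search over a sorted list; runs in O(log n) time."""
    low, high = 0, len(items) - 1
    while low <= high:
        mid = (low + high) // 2   # split the remaining range in half
        if items[mid] == target:
            return mid            # found: return the index
        elif items[mid] < target:
            low = mid + 1         # discard the lower half
        else:
            high = mid - 1        # discard the upper half
    return -1                     # target not present

print(binary_search([1, 3, 5, 7, 9], 7))  # prints 3
```

Each iteration halves the search space, which is why doubling the input adds only one extra step, in contrast to a linear scan whose work grows in direct proportion to the input size.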

Terms definitions
1. artificial intelligence. The discipline of Artificial Intelligence (AI) is a subset of computer science dedicated to developing systems capable of executing tasks usually requiring human intellect, such as reasoning, learning, planning, perception, and language comprehension. Drawing upon diverse fields such as psychology, linguistics, philosophy, and neuroscience, AI is instrumental in the creation of machine learning models and natural language processing systems. It also significantly contributes to the development of virtual assistants and affective computing systems. AI finds applications in numerous sectors like healthcare, industry, government, and education. However, it also brings up ethical and societal issues, thus requiring regulatory policies. With the advent of sophisticated techniques like deep learning and generative AI, the field continues to expand, opening up new avenues in various sectors.
Algorithm (Wikipedia)

In mathematics and computer science, an algorithm (/ˈælɡərɪðəm/ ) is a finite sequence of rigorous instructions, typically used to solve a class of specific problems or to perform a computation. Algorithms are used as specifications for performing calculations and data processing. More advanced algorithms can use conditionals to divert the code execution through various routes (referred to as automated decision-making) and deduce valid inferences (referred to as automated reasoning), achieving automation eventually. Using human characteristics as descriptors of machines in metaphorical ways was already practiced by Alan Turing with terms such as "memory", "search" and "stimulus".

In a loop, subtract the smaller number from the larger one. Exit the loop before a subtraction would make either number negative. Then check whether either number equals zero: if so, the other number is the greatest common divisor; if not, return the two numbers to the subtraction loop.
Flowchart of using successive subtractions to find the greatest common divisor of numbers r and s
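The steps in the flowchart above can be sketched directly in Python (an illustrative implementation, with variable names matching the flowchart's r and s):

```python
def gcd_by_subtraction(r, s):
    """Greatest common divisor of r and s via successive subtraction."""
    while r != 0 and s != 0:
        if r > s:
            r = r - s   # subtract the smaller number from the larger
        else:
            s = s - r
    return r if r != 0 else s  # the remaining nonzero number is the GCD

print(gcd_by_subtraction(1071, 462))  # prints 21
```

This is the subtraction form of Euclid's algorithm; modern implementations usually replace repeated subtraction with the remainder operation, which reaches the same result in far fewer steps.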

In contrast, a heuristic is an approach to problem solving that may not be fully specified or may not guarantee correct or optimal results, especially in problem domains where there is no well-defined correct or optimal result. For example, social media recommender systems rely on heuristics: although widely characterized as "algorithms" in 21st-century popular media, they cannot deliver provably correct results because of the nature of the problem.

As an effective method, an algorithm can be expressed within a finite amount of space and time and in a well-defined formal language for calculating a function. Starting from an initial state and initial input (perhaps empty), the instructions describe a computation that, when executed, proceeds through a finite number of well-defined successive states, eventually producing "output" and terminating at a final ending state. The transition from one state to the next is not necessarily deterministic; some algorithms, known as randomized algorithms, incorporate random input.
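As a small illustration of the randomized algorithms mentioned above (a hypothetical example, not from the original text), quicksort with a randomly chosen pivot incorporates random input yet still terminates with a correct, deterministic result:

```python
import random

def randomized_quicksort(items):
    """Quicksort with a random pivot: a classic randomized algorithm.

    The random choice of pivot gives an expected O(n log n) running time
    on any input order, though the exact sequence of states differs
    between runs."""
    if len(items) <= 1:
        return items                      # base case: already sorted
    pivot = random.choice(items)          # the random input
    less = [x for x in items if x < pivot]
    equal = [x for x in items if x == pivot]
    greater = [x for x in items if x > pivot]
    return randomized_quicksort(less) + equal + randomized_quicksort(greater)

print(randomized_quicksort([5, 2, 9, 1, 5, 6]))  # prints [1, 2, 5, 5, 6, 9]
```

Note that while the intermediate states vary from run to run, the output is always the sorted list, which is what distinguishes a randomized algorithm from a heuristic.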
