In today’s world, Artificial Intelligence is one of the most in-demand fields, with applications in every domain ranging from elevators to space shuttles. To implement AI wisely in any field, one must first understand how it is applied across fields. In this post, I bring together various implementations of AI. I begin with knowledge-based systems that work on large databases; applications of such systems can be found in elevators, blast furnaces and other crucial settings. Then there are fields like cognitive radios, brain-computer interaction, computer graphics, facial animation, swarm intelligence, anti-virus software and soft computing. So, let’s begin…
Knowledge based & Learning Systems:
Programs that use artificial intelligence or expert-system techniques to solve problems are called Knowledge Based Systems. Such a system incorporates a database of expert knowledge, with couplings and linkages designed to facilitate its retrieval in response to specific queries or to transfer expertise from one domain of knowledge to another. These systems attempt to represent knowledge explicitly, via tools such as ontologies and rules, rather than implicitly in code the way a conventional computer program does. To accomplish feats of apparent intelligence, these systems rely on two components: a knowledge base and an inference engine.
A knowledge base is an organized collection of facts about the system’s domain. An inference engine interprets and evaluates the facts in the knowledge base in order to provide an answer. One of the oldest applications of knowledge-based systems was DENDRAL, a system developed at Stanford University in the ’60s. It was written in Lisp (then considered the language for AI) and was used primarily for determining chemical structures. CONGEN is one of its most widely used subprograms.
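The knowledge-base / inference-engine split described above can be sketched with a toy forward-chaining engine. This is a minimal illustration only; the facts and rules below are invented and are not DENDRAL’s actual knowledge.

```python
# Knowledge base: known facts plus if-then rules (premises -> conclusion).
# All names here are hypothetical, for illustration only.
facts = {"has_mass_spectrum", "contains_carbon"}
rules = [
    ({"has_mass_spectrum", "contains_carbon"}, "candidate_organic"),
    ({"candidate_organic"}, "run_structure_generator"),
]

def forward_chain(facts, rules):
    """Inference engine: repeatedly fire any rule whose premises all hold."""
    derived = set(facts)
    changed = True
    while changed:
        changed = False
        for premises, conclusion in rules:
            if premises <= derived and conclusion not in derived:
                derived.add(conclusion)
                changed = True
    return derived

print(forward_chain(facts, rules))
```

Separating the rules from the engine is the whole point: the same `forward_chain` loop works unchanged when the knowledge base grows.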
DENDRAL programs were used to aid in structure determination of many kinds of compounds: terpenoid natural products from plant and marine animal sources, marine sterols, organic acids in human urine and other body fluids, photochemical rearrangement products, impurities in manufactured chemicals, conjugates of pesticides with sugars and amino acids, antibiotics, metabolites of microorganisms, insect hormones, and pheromones, among many others.
Students also used CONGEN to check the completeness of published solutions, and in several cases the program found plausible alternatives to the published structures. It thus served as a valuable check on conclusions drawn from experimental data.
The Meta-DENDRAL program, another subprogram of DENDRAL, was adapted to a second spectroscopic technique, 13C nuclear magnetic resonance (13C-NMR) spectroscopy; the new version made it possible to direct Meta-DENDRAL’s induction machinery under a model of 13C-NMR spectroscopy.
The most significant example of a knowledge-based system for control was the one installed in 1987 at the NKK blast furnace, which was characterized by the high degree of integration with conventional information processing systems and the widespread use of fuzzy control methods.
A second excellent example of a control system is one developed by Mitsubishi Electric for controlling a group of elevators. The AI-2100 Elevator-Group Control System, introduced in ’88, used a fuzzy rule base divided into off-line and on-line rules.
Off-line rules were used as a sort of default set, independent of hall calls (i.e., the signals generated when passengers waiting in the halls push the up or down buttons). Off-line rules, for example, determined the number of elevators that should be near the ground floor, near the top, or near the middle of the building, depending on the time of day. On-line rules were invoked in response to hall calls, and aimed to prevent bunching of cars in the same locations, thereby minimizing the average waiting time for passengers.
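The on-line dispatch idea can be sketched as fuzzy rules that score each car for a hall call, with the best-scoring car dispatched. The membership functions and rule weights below are invented for illustration; they are not Mitsubishi’s actual AI-2100 rule base.

```python
def near(distance_floors, scale=10.0):
    """Fuzzy membership in [0, 1] for 'car is near the call'."""
    return max(0.0, 1.0 - distance_floors / scale)

def lightly_loaded(load_fraction):
    """Fuzzy membership for 'car is lightly loaded'."""
    return max(0.0, 1.0 - load_fraction)

def score(car, call_floor):
    # Rule 1: IF car is near the call THEN it is a good choice (weight 0.7).
    # Rule 2: IF car is lightly loaded THEN it is a good choice (weight 0.3).
    return (0.7 * near(abs(car["floor"] - call_floor))
            + 0.3 * lightly_loaded(car["load"]))

cars = [
    {"name": "A", "floor": 1, "load": 0.9},
    {"name": "B", "floor": 6, "load": 0.2},
    {"name": "C", "floor": 12, "load": 0.5},
]
best = max(cars, key=lambda c: score(c, call_floor=7))
print(best["name"])  # "B": close to floor 7 and lightly loaded
```

Because the memberships are graded rather than binary, a car that is slightly farther away but nearly empty can still win the call, which is exactly the bunching-avoidance behaviour described above.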
Learning systems find application in various fields. Reportedly, one company saved a large amount of money by controlling its fuel processing with decision-tree learning techniques, and a finance firm implemented a learning apprentice system that gives advice on scheduling meetings with a computerized calendar.
An early example of a learning apprentice is the LEAP system, which dealt with the domain of VLSI digital logic design. Another system, CAP, provided an editing and email interface to an online calendar. It learnt users’ scheduling preferences through routine use, enabling it to give customized scheduling advice to each user.
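The decision-tree idea behind the fuel-processing example above can be sketched as picking the attribute whose split yields the lowest weighted entropy. The data set is invented for illustration and has nothing to do with any real plant.

```python
import math
from collections import Counter

# Each sample: (attributes, label). Hypothetical process readings.
data = [
    ({"temp": "high", "flow": "low"}, "adjust"),
    ({"temp": "high", "flow": "high"}, "adjust"),
    ({"temp": "low", "flow": "low"}, "ok"),
    ({"temp": "low", "flow": "high"}, "ok"),
]

def entropy(labels):
    """Shannon entropy of a label multiset."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def best_attribute(data, attributes):
    """Choose the split attribute with the lowest weighted child entropy."""
    def split_entropy(attr):
        groups = {}
        for x, y in data:
            groups.setdefault(x[attr], []).append(y)
        return sum(len(g) / len(data) * entropy(g) for g in groups.values())
    return min(attributes, key=split_entropy)

print(best_attribute(data, ["temp", "flow"]))  # "temp" separates the labels perfectly
```

Applying this choice recursively inside each branch is all a full decision-tree learner such as ID3 does.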
AI in Cognitive Radios:
A cognitive radio (CR) may be defined as a radio that is aware of its environment and of its own internal state and that, with knowledge of these elements and any stored pre-defined objectives, can make and implement decisions about its behaviour.
In general, a cognitive radio may be expected to look at parameters such as channel occupancy, free channels, the type of data to be transmitted, and the modulation types that may be used. It must also look at the regulatory requirements, and in some instances knowledge of geography may alter what it is allowed to do.
CR finds application in a wide range of fields, a broad analysis of which can be found in .
Several research efforts are currently on-going around the world to introduce CR-related mechanisms at various OSI layers.
The AI techniques involved in CR include ANNs, meta-heuristic algorithms, HMMs, rule-based systems, case-based systems and ontology-based systems.
ANNs have been adopted for spectrum sensing in CR, and to build a signal classifier that utilizes extracted cyclo-stationary signal features. The combination of cyclo-stationary analysis and ANNs provides efficient and reliable signal classification, and it reduces on-line processing time by performing a significant amount of computation off-line. ANNs have also been used to classify different IEEE 802.11 signals (the complementary code keying signal and the orthogonal frequency-division multiplexing signal) based on frequency features, and in evaluating an ANN-based spectrum sensing algorithm for wireless mesh networks.
ANNs have also been used for radio parameter adaptation in an optimization algorithm for large-scale cognitive wireless clouds, and for pattern classification in pattern-based transmission for CR. A comprehensive analysis of the above-mentioned applications can be found in .
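The pipeline described above, heavy feature extraction off-line and a light neural classifier on-line, can be sketched with a single perceptron standing in for the ANN. The 2-D feature vectors are synthetic placeholders, not real cyclo-stationary statistics.

```python
def train_perceptron(samples, epochs=20, lr=0.1):
    """Train a 2-input perceptron with the classic error-correction rule."""
    w, b = [0.0, 0.0], 0.0
    for _ in range(epochs):
        for x, label in samples:  # label: 1 = CCK-like, 0 = OFDM-like
            pred = 1 if w[0] * x[0] + w[1] * x[1] + b > 0 else 0
            err = label - pred
            w = [w[0] + lr * err * x[0], w[1] + lr * err * x[1]]
            b += lr * err
    return w, b

def classify(w, b, x):
    return 1 if w[0] * x[0] + w[1] * x[1] + b > 0 else 0

# Hypothetical feature vectors (e.g. two spectral-correlation peak heights),
# computed off-line; only the cheap classify() call runs on-line.
samples = [([0.9, 0.1], 1), ([0.8, 0.2], 1), ([0.1, 0.9], 0), ([0.2, 0.7], 0)]
w, b = train_perceptron(samples)
print(classify(w, b, [0.85, 0.15]))  # 1: CCK-like
```

A real deployment would use a multi-layer network and far richer features, but the off-line/on-line division of labour is the same.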
AI in Brain-Computer Interfaces:
Artificial intelligence algorithms find application in brain-computer interfaces (BCIs), where they convert input signals into output control signals. These algorithms include linear or nonlinear equations, neural networks and other methods, and may incorporate continual adaptation of important parameters to key aspects of the input provided by the user. BCI outputs can be cursor movement, letter or icon selection, or another form of device control, and they provide the feedback that the user and the BCI use to adapt and optimize communication. Bayesian algorithms, which can assess the certainty that the system’s interpretation of the user’s intention is correct, may also prove useful: they can arrest communication when this certainty falls below a criterion level, thereby reducing errors in BCI performance.
The algorithm used in a BCI is called a translation algorithm: a series of computations that transforms the input features derived by the signal-processing stage into actual device control commands. Stated differently, a translation algorithm takes abstract feature vectors that reflect specific aspects of the current state of the user’s EEG or single-unit activity (i.e., aspects that encode the message the user wants to communicate) and transforms those vectors into application-dependent device commands.
The algorithm can be adaptive or non-adaptive. Adaptive algorithms can use simple handcrafted rules or more sophisticated machine-learning algorithms. The output of the algorithm may be discrete (e.g., letter selection) or continuous (e.g., cursor movement).
In all cases the goal is to maximize performance and practicability for the chosen application.
Current work on translation algorithms focuses primarily on those applicable to scalp-recorded EEG activity. Because the human brain is a highly adaptive controller that relies on both predictive methods and feedback information, it is desirable, and perhaps essential, that BCI translation algorithms also be adaptive. One current algorithm adapts continually to the mean amplitude and/or variance of its EEG input features. Because of the adaptive capacity of the brain and individual differences in this capacity, evaluation of translation algorithms should adopt appropriate statistical approaches (e.g., bootstrapping, cross-validation, forward prediction) and apply them to a sufficient number of users and relevant applications.
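An adaptive linear translation algorithm of the kind just mentioned can be sketched as follows: the feature (say, a mu-rhythm amplitude) is normalized by running estimates of its mean and spread, and the result drives vertical cursor velocity. The decay constant and gain are hypothetical, not taken from any published BCI.

```python
class AdaptiveTranslator:
    """Linear translation with continual adaptation to feature statistics."""

    def __init__(self, decay=0.95, gain=1.0):
        self.mean = 0.0
        self.var = 1.0
        self.decay = decay
        self.gain = gain

    def update(self, feature):
        # Continually adapt to the mean and variance of the input feature.
        self.mean = self.decay * self.mean + (1 - self.decay) * feature
        dev = feature - self.mean
        self.var = self.decay * self.var + (1 - self.decay) * dev * dev
        # Linear translation: normalized deviation -> cursor velocity.
        return self.gain * dev / (self.var ** 0.5)

t = AdaptiveTranslator()
velocities = [t.update(f) for f in [5.0, 5.2, 4.8, 9.0]]
print(velocities)  # the jump to 9.0 yields the largest positive command
```

Because the statistics track the user, the same raw amplitude maps to a smaller command as the user’s baseline drifts upward, which is the point of making the translator adaptive.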
BCI translation algorithms include linear equations, neural networks, and numerous other classification techniques. The most difficult aspect of their design and implementation is the need for continuing adaptation to the characteristics of the input provided by the user.
AI in Computer Vision:
The understanding and description of object behaviours is a hot topic in computer vision. Trajectory analysis is one of the basic problems in behaviour understanding, and learning trajectory patterns that can be used to detect anomalies and predict object trajectories is an interesting and important problem within it. The distribution patterns of trajectories can be learnt using a hierarchical self-organizing neural network.
This helps to describe a statistical model for object trajectories generated from image sequences. The movement of an object is described by a sequence of flow vectors, each consisting of four elements that represent the position and velocity of the object in the image plane. The statistical model of object trajectories is formed from two two-layer competitive learning networks connected by leaky neurons. Both networks are trained using vector quantization, in which only the winning neuron’s weights are updated while the others remain unchanged.
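The winner-take-all vector quantization step can be sketched directly: each flow vector (x, y, dx, dy) is matched to its nearest prototype, and only the winning prototype is moved toward the input. The prototypes and training vectors below are invented for illustration.

```python
def nearest(prototypes, v):
    """Index of the prototype closest to flow vector v (squared distance)."""
    dists = [sum((p[i] - v[i]) ** 2 for i in range(4)) for p in prototypes]
    return dists.index(min(dists))

def train(prototypes, flow_vectors, lr=0.5):
    for v in flow_vectors:
        w = nearest(prototypes, v)          # winner-take-all competition
        prototypes[w] = [p + lr * (x - p)   # move only the winner
                         for p, x in zip(prototypes[w], v)]
    return prototypes

# Two prototypes and a few (x, y, dx, dy) flow vectors, all hypothetical.
protos = [[0.0, 0.0, 0.0, 0.0], [10.0, 10.0, 0.0, 0.0]]
flows = [[1, 1, 0.5, 0.5], [9, 9, -0.5, 0.0], [2, 1, 0.5, 0.4]]
protos = train(protos, flows)
print(nearest(protos, [1.5, 1.0, 0.5, 0.5]))  # 0: matches the first cluster
```

After training, each prototype summarizes one cluster of flow vectors, and an input far from every prototype can be flagged as an anomalous movement.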
Artificial Neural Networks in Computer Graphics:
Nowadays, ANNs play an important role in the graphics field. Graphics designers are trying to merge real images with computer-generated ones to enhance the visualization of the output image. Radiosity techniques help generate some of the most realistic images, and Radiosity for Virtual Reality Systems (ROVER) is an emerging research area in which ANNs are used to bring radiosity to virtual reality.
It is generally recognized that the traditional implementation of radiosity is computationally very expensive and therefore not feasible for use in VR (virtual reality) systems, where practical data sets are of huge complexity.
AI in Facial Animation:
Modeling and animation of human faces is one of the most difficult tasks in computer graphics today, even more so when life is to be breathed into digitized versions of real, well-known individuals. Neural networks can be used to learn each variation in facial expression for animated sequences, and clustering and machine-learning methods are combined to learn the correspondence between speech acoustics and face animation parameters. The main learning machines used for speech-driven facial animation are HMMs, SVMs and neural networks; neural networks and other machine-learning tools are also used for expression recognition. A very nice example is the MPEG-4 Face and Body Animation system, which also promotes the development of emotional systems: using it for emotion simulation, we can create a talking head that integrates speech, facial animation and emotional expressions.
AI in Anti-virus Software:
Artificial neural networks and other AI techniques also play an increasingly important role in virus detection, strengthening the internal workings of anti-virus software so that it can detect and fix many kinds of viruses. By being shown many examples of viruses and non-viruses, a neural network learns to recognize viruses better than traditional heuristics hand-tuned by virus researchers. Such a network can automatically detect an extremely high percentage of new and unknown boot-record viruses; Norton Antivirus uses these methods to provide protection against both known and unknown boot-sector viruses.
Soft Computing:
Soft computing differs from conventional (hard) computing in that, unlike hard computing, it is tolerant of imprecision, uncertainty, partial truth and approximation. The role model for soft computing is the human mind. The principal constituents (tools and techniques) of soft computing (SC) are fuzzy logic (FL), neural networks (NN), support vector machines (SVM), evolutionary computation (EC), machine learning (ML), probabilistic reasoning (PR), genetic algorithms (GA) and chaos theory. Currently, soft computing finds application in handwriting recognition, automotive systems and manufacturing, image processing and data compression, architecture, decision-support systems, power systems, neuro-fuzzy systems and fuzzy logic control.
One example of a particularly effective combination is what has come to be known as “neuro-fuzzy systems.” Such systems are becoming increasingly visible in consumer products ranging from air conditioners and washing machines to photocopiers and camcorders, as well as in many industrial applications.
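The fuzzy-control half of such systems can be sketched with hypothetical washing-machine rules (not any vendor’s actual controller): dirtiness is fuzzified by overlapping membership functions, each rule proposes a wash time, and the proposals are combined by a weighted average.

```python
# Dirtiness is rated on a 0-8 scale; memberships deliberately overlap so
# that intermediate inputs blend both rules. All numbers are invented.
def mu_low(d):
    return max(0.0, (6.0 - d) / 6.0)

def mu_high(d):
    return max(0.0, (d - 2.0) / 6.0)

def wash_time(dirtiness):
    # Rule 1: IF dirtiness is low THEN wash 20 minutes.
    # Rule 2: IF dirtiness is high THEN wash 60 minutes.
    w_low, w_high = mu_low(dirtiness), mu_high(dirtiness)
    if w_low + w_high == 0:
        return 40.0  # defensive fallback; unreachable for d in [0, 8]
    return (w_low * 20.0 + w_high * 60.0) / (w_low + w_high)

print(wash_time(2.0))  # ~20 minutes: clearly "low"
print(wash_time(4.0))  # ~40 minutes: both rules fire equally
print(wash_time(7.0))  # ~60 minutes: clearly "high"
```

In a neuro-fuzzy system the membership-function shapes and rule weights would be tuned automatically by a neural learning procedure rather than set by hand as here.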
Swarm Intelligence:
The emergent collective intelligence of groups of simple agents is termed swarm intelligence (SI). Examples include group foraging by social insects, cooperative transportation, division of labour, nest-building by social insects, and collective sorting and clustering. Developing an SI system involves identifying analogies between swarm biology and IT systems, understanding how to model realistic swarm biology computationally, and engineering the required model, then simplifying and tuning it for IT applications.
Swarm Intelligence-based Applications:
Swarm intelligence finds application in generating complex interactive virtual environments for the movie industry, cargo arrangement at airlines, route scheduling at delivery companies, routing packets in telecommunication networks, power-grid optimization control, data clustering, data routing in sensor networks, control of unmanned vehicles for the U.S. military, and planetary mapping and micro-satellite control at NASA.
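Many of the optimization applications above rest on swarm algorithms such as particle swarm optimization (PSO). Here is a minimal 1-D PSO minimizing f(x) = x² as a stand-in objective; the parameters are typical textbook values, not from any production system.

```python
import random

random.seed(0)  # deterministic run for the example

def pso(f, n_particles=10, iters=100, w=0.5, c1=1.5, c2=1.5):
    """Each particle is pulled toward its own best and the swarm's best."""
    pos = [random.uniform(-10, 10) for _ in range(n_particles)]
    vel = [0.0] * n_particles
    pbest = pos[:]                  # personal best positions
    gbest = min(pos, key=f)         # global best position
    for _ in range(iters):
        for i in range(n_particles):
            r1, r2 = random.random(), random.random()
            vel[i] = (w * vel[i]
                      + c1 * r1 * (pbest[i] - pos[i])   # pull to own best
                      + c2 * r2 * (gbest - pos[i]))     # pull to swarm best
            pos[i] += vel[i]
            if f(pos[i]) < f(pbest[i]):
                pbest[i] = pos[i]
            if f(pos[i]) < f(gbest):
                gbest = pos[i]
    return gbest

best = pso(lambda x: x * x)
print(best)  # converges toward the minimum at 0
```

Each particle follows only two simple pulls, yet the swarm as a whole homes in on the optimum, which is the emergent behaviour the section describes.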
Apart from these, AI systems are making their way into trading, manufacturing processes, medicine and law firms, to name a few.
Conclusion: Based on all the fields discussed here, it is quite evident that artificial intelligence is finding significant application across domains. It can thus be expected that in the coming days all fields, including those not discussed in this text, will make more and more use of artificial intelligence techniques to improve our lives.
References:
Feigenbaum, E.A. and Buchanan, B.G., 1993. DENDRAL and META-DENDRAL: Roots of knowledge systems and expert system applications. Artificial Intelligence, 59(1-2), pp.233-240.
Feigenbaum, E.A., Friedland, P.E., Johnson, B.B., Nii, H.P., Schorr, H., Shrobe, H. and Engelmore, R.S., 1994. Knowledge-based systems research and applications in Japan, 1992. AI Magazine, 15(2), p.29.
Mitchell, T.M., Mahadevan, S. and Steinberg, L.I., 1985. LEAP: A learning apprentice for VLSI. In Proceedings of the 9th International Joint Conference on Artificial Intelligence, pp.574-580.
Mitchell, T.M., Caruana, R., Freitag, D., McDermott, J. and Zabowski, D., 1994. Experience with a learning personal assistant. Communications of the ACM, 37(7), pp.80-91.
He, A., Bae, K.K., Newman, T.R., Gaeddert, J., Kim, K., Menon, R., Morales-Tirado, L., Neel, J.J., Zhao, Y., Reed, J.H. and Tranter, W.H., 2010. A survey of artificial intelligence for cognitive radios. IEEE Transactions on Vehicular Technology, 59(1-4), pp.1578-1592.
Schalk, G., McFarland, D.J., Hinterberger, T., Birbaumer, N. and Wolpaw, J.R., 2004. BCI2000: a general-purpose brain-computer interface (BCI) system. IEEE Transactions on Biomedical Engineering, 51(6), pp.1034-1036.
Rusnell, B.J., 2007. Radiosity for Computer Graphics. University of Saskatchewan.
Aleksic, P.S. and Katsaggelos, A.K., 2006. Automatic facial expression recognition using facial animation parameters and multistream HMMs. IEEE Transactions on Information Forensics and Security, 1(1), pp.3-11.
Zadeh, L.A., 1994. Soft computing and fuzzy logic. IEEE Software, 11(6), p.48.
Karaboga, D. and Akay, B., 2009. A survey: algorithms simulating bee swarm intelligence. Artificial Intelligence Review, 31(1-4), pp.61-85.