Autonomic computing is a vision introduced by IBM in 2001: large-scale distributed systems that manage themselves with minimal human intervention. It was proposed to tame the growing complexity of computer systems management by improving the systems' capacity for self-management.
Guided by high-level policies, such a system continuously inspects and optimizes itself, adapting to dynamic conditions. The idea is modeled on the human autonomic nervous system, which reacts to stimuli without conscious effort: the computer manages itself through its own computing capabilities, applying a high degree of artificial intelligence while remaining hidden from its users.
This paradigm also shifts the emphasis of technology from raw computing toward unified data. Users can access information quickly thanks to centralized storage and the ability to draw data from multiple, distributed sources.
Vital elements of autonomic computing are:
- Self-organizing network
- Self-healing systems
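To make the self-healing idea concrete, here is a minimal sketch (all names, such as `Component` and `self_heal`, are hypothetical and not part of any IBM specification): a watchdog routine detects failed components and restores them without operator input.

```python
class Component:
    """A hypothetical managed component that can fail and be restarted."""
    def __init__(self, name):
        self.name = name
        self.healthy = True

    def restart(self):
        # Corrective action: bring the component back to a healthy state.
        self.healthy = True

def self_heal(components):
    """Detect failed components and restore them automatically."""
    healed = []
    for c in components:
        if not c.healthy:
            c.restart()
            healed.append(c.name)
    return healed

db = Component("database")
web = Component("web-server")
db.healthy = False                # simulate a failure
print(self_heal([db, web]))      # → ['database']
```

A real self-healing system would of course detect failures through health checks or event notifications rather than an in-memory flag; the point is that recovery happens inside the loop, not through a human operator.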
ARCHITECTURE OF AUTONOMIC COMPUTING
The architecture defines two primary components: an autonomic manager and the managed resources. It achieves four event-driven tasks:
- Monitoring: collecting the application's requirements from environment sensors so that the necessary adjustments can be determined.
- Analyzing: applying high-level artificial intelligence to the collected data.
- Planning: deciding which actions need to be taken to achieve the goals and objectives.
- Executing: carrying out the plan by acting on the managed elements.
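The four tasks above can be sketched as a single control loop. The following is a minimal, hypothetical illustration (the policy threshold, the `load`/`capacity` metrics, and the assumed effect of adding capacity are all invented for the example), not IBM's reference implementation:

```python
class AutonomicManager:
    """Minimal monitor-analyze-plan-execute control loop."""
    def __init__(self, resource):
        self.resource = resource       # the managed element's state
        self.knowledge = {}            # knowledge shared among the four parts

    def monitor(self):
        # Collect metrics from the environment via sensors.
        self.knowledge["load"] = self.resource["load"]

    def analyze(self):
        # Check whether the current state violates the high-level policy.
        return self.knowledge["load"] > 0.8   # assumed policy: load <= 80%

    def plan(self):
        # Choose an adjustment that brings the system back within policy.
        return {"add_capacity": 1}

    def execute(self, action):
        # Apply the plan through effectors on the managed element.
        self.resource["capacity"] += action["add_capacity"]
        self.resource["load"] /= 2    # assumed effect of the extra capacity

    def run_once(self):
        self.monitor()
        if self.analyze():
            self.execute(self.plan())

resource = {"load": 0.9, "capacity": 1}
AutonomicManager(resource).run_once()
print(resource)    # load reduced, capacity increased
```

In practice the loop runs continuously, and the knowledge base carries far richer state than a single metric, but the event-driven shape is the same.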
The autonomic manager implements this control loop. Its main goal is to preserve a correct software architecture by making the four parts work together and enhancing the functionality of the autonomic loop. Using variation points, it can defer design decisions by externalizing architectural components and combining them. It also absorbs and creates knowledge about the characteristics of the managed resources, and this knowledge is shared continuously among the four parts.
The managed resource is the controlled component of the system and a core element of the autonomic computing architecture. It may be a single resource, such as a database server or a router, or a collection of resources, such as a pool of servers, a cluster, or a business application. The autonomic manager communicates with a managed resource through its touchpoint: the implementation of the manageability interface for that resource.
The manageability interface is divided into sensor and effector operations. Sensors support the autonomic manager's work by transmitting events or property values, while effectors are used to make changes to the state of the managed resource. The interaction styles performed by sensor and effector operations are:
- Sensor retrieve-state
- Sensor receive-notification
- Effector perform-operation
- Effector call-out-request
There are various objectives that an autonomic element needs to implement, and its computational tasks must be performed with minimal inhibitors.
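The sensor/effector split of the manageability interface can be sketched as follows. This is a hypothetical illustration of the four interaction styles listed above (class and method names such as `Touchpoint` and `perform_operation` are assumptions for the example, not a published API):

```python
class Touchpoint:
    """Hypothetical manageability interface for one managed resource,
    split into sensor (read) and effector (write) operations."""
    def __init__(self, state):
        self._state = state
        self._subscribers = []

    # --- sensor operations ---
    def retrieve_state(self, key):
        """Sensor retrieve-state: the manager pulls a property value."""
        return self._state[key]

    def subscribe(self, callback):
        """Sensor receive-notification: the manager registers for events."""
        self._subscribers.append(callback)

    # --- effector operations ---
    def perform_operation(self, key, value):
        """Effector perform-operation: the manager changes resource state."""
        self._state[key] = value
        for cb in self._subscribers:
            cb(key, value)    # push the change to subscribed managers

tp = Touchpoint({"status": "running"})
events = []
tp.subscribe(lambda k, v: events.append((k, v)))
tp.perform_operation("status", "degraded")
print(tp.retrieve_state("status"), events)
```

The effector call-out-request style (where the resource asks the manager for a decision before acting) is omitted here for brevity, but would be a second callback channel in the same interface.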
CONDITIONS DEFINED BY IBM THAT EXPLAIN AN AUTONOMIC SYSTEM:
- The system must know the resources it has access to, along with its own capabilities and limitations.
- It should be able to configure and reconfigure itself automatically under unforeseeable conditions.
- The system must monitor and regulate its performance for optimal functioning, ensuring the most efficient computing process.
- The system must identify and detect threats, and protect itself to maintain system integrity and security.
- The system must adapt to environmental changes and be able to determine protocols for communication.
- The system cannot rely on a proprietary environment; rather, it should be based on open standards.
- It must remain transparent to users while anticipating demand.
The architecture of the ACI is a holistic approach aimed at bringing a new level of automation. It is also one of the core building blocks of pervasive computing, a vision in which invisible computers surround everyone, performing the same functions as today's machines and operating over highly interconnected networks.