The term neural network originally referred to a network or circuit of biological neurons. The modern usage of the term often refers to artificial neural networks, which are composed of artificial neurons or nodes. Thus the term may refer to either biological neural networks, made up of real biological neurons, or artificial neural networks, used for solving artificial intelligence problems.
- A neural network is a computing system made up of a number of simple, highly interconnected processing elements, which process information by their dynamic state response to external inputs.
- Neural networks are named after the cells in the human brain that perform intelligent operations.
- The brain is made up of billions of neuron cells. Each of these cells is like a tiny computer with extremely limited capabilities; however, connected together, these cells form the most intelligent system known.
- Neural networks are formed from hundreds or thousands of simulated neurons connected together in much the same way as the brain's neurons.
- Neural networks are typically organized in layers. Layers are made up of a number of interconnected 'nodes' which contain an 'activation function'.
- Patterns are presented to the network via the 'input layer', which communicates to one or more 'hidden layers' where the actual processing is done via a system of weighted 'connections'.
- The hidden layers then link to an 'output layer' where the answer is output as shown in the graphic below.
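The layered structure described above can be sketched as a minimal forward pass. The layer sizes, the random weights, and the choice of a sigmoid activation function are illustrative assumptions, not values from the text:

```python
import numpy as np

def sigmoid(x):
    """Logistic activation: squashes any real value into (0, 1)."""
    return 1.0 / (1.0 + np.exp(-x))

def forward(inputs, w_hidden, w_output):
    """One forward pass: input layer -> hidden layer -> output layer.

    Each layer multiplies its inputs by a weight matrix (the
    'weighted connections') and applies the activation function.
    """
    hidden = sigmoid(inputs @ w_hidden)   # hidden layer activations
    output = sigmoid(hidden @ w_output)   # output layer activation
    return output

# Illustrative sizes: 3 input nodes, 4 hidden nodes, 1 output node.
rng = np.random.default_rng(0)
w_hidden = rng.normal(size=(3, 4))
w_output = rng.normal(size=(4, 1))
print(forward(np.array([0.2, 0.5, 0.9]), w_hidden, w_output))
```

Each pattern presented at the input layer is transformed by the weighted connections of each successive layer until a single activation emerges at the output layer.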
Back Propagation – Design Parameters
Employing a back propagation neural network requires an understanding of a number of network design options.
- Inputs are the independent variables, which must be scaled to fall within the range 0 to 1. The number of input nodes is fixed by the number of inputs.
- Inputs should not be nominal scale, but can be binary, ordinal or better. Nominal inputs can be accommodated by providing a separate input node for each category, each associated with a binary (0 or 1) input.
- For the purpose of this research there is always a single output; the output is scaled to fall within the range 0 to 1.
- The hidden layers allow for a number of potentially different combinations of inputs that might result in high (or low) outputs.
- Each successive hidden layer represents the possibility of recognizing the importance of combinations of combinations.
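The input handling described above (scaling numeric inputs to 0–1, and one binary node per category for nominal inputs) can be sketched as follows. The feature values and category list are illustrative assumptions:

```python
import numpy as np

def minmax_scale(column):
    """Linearly rescale a numeric input column into the range [0, 1]."""
    lo, hi = column.min(), column.max()
    return (column - lo) / (hi - lo)

def one_hot(value, categories):
    """Encode a nominal value as one binary (0 or 1) input node per category."""
    return [1 if value == c else 0 for c in categories]

# Illustrative numeric input: the smallest value maps to 0, the largest to 1.
ages = np.array([18.0, 35.0, 52.0, 70.0])
print(minmax_scale(ages))

# Illustrative nominal input: one separate input node per category.
colors = ["red", "green", "blue"]
print(one_hot("green", colors))    # -> [0, 1, 0]
```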
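A minimal sketch of back propagation itself, on a toy problem: the error at the sigmoid output (kept in the 0–1 range, as above) is propagated back through one hidden layer to adjust the weighted connections. The XOR dataset, network sizes, learning rate, and iteration count are illustrative choices, not values from the text:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Toy dataset (XOR): a pattern a single linear unit cannot learn.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

rng = np.random.default_rng(42)
w1, b1 = rng.normal(size=(2, 4)), np.zeros((1, 4))  # input -> hidden
w2, b2 = rng.normal(size=(4, 1)), np.zeros((1, 1))  # hidden -> output
lr = 1.0  # learning rate (illustrative choice)

def loss():
    pred = sigmoid(sigmoid(X @ w1 + b1) @ w2 + b2)
    return float(np.mean((pred - y) ** 2))

initial_loss = loss()
for _ in range(5000):
    # Forward pass through the weighted connections.
    h = sigmoid(X @ w1 + b1)
    out = sigmoid(h @ w2 + b2)
    # Backward pass: propagate the output error back layer by layer.
    d_out = (out - y) * out * (1 - out)      # delta at the output layer
    d_h = (d_out @ w2.T) * h * (1 - h)       # delta at the hidden layer
    w2 -= lr * h.T @ d_out
    b2 -= lr * d_out.sum(axis=0, keepdims=True)
    w1 -= lr * X.T @ d_h
    b1 -= lr * d_h.sum(axis=0, keepdims=True)
final_loss = loss()
print(initial_loss, final_loss)
```

Repeating the forward/backward cycle drives the error down, which is the sense in which the network "learns" rather than being programmed.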
Advantages
- A neural network can perform tasks that a linear program cannot.
- When an element of the neural network fails, it can continue without any problem because of its parallel nature.
- A neural network learns and does not need to be reprogrammed.
- It can be implemented in any application without much difficulty.
Disadvantages
- The neural network needs training to operate.
- The architecture of a neural network differs from that of a microprocessor, and therefore needs to be emulated.
- Large neural networks require high processing time.
Real Time Examples
Neural networks have broad applicability to real-world business problems. In fact, they have already been successfully applied in many industries. Since neural networks are best at identifying patterns or trends in data, they are well suited for prediction and forecasting needs, including:
- Sales forecasting.
- Industrial process control.
- Customer research.
- Data validation.
- Risk management.
- Target marketing.