Bachelor thesis, 2005
42 pages, grade: 2.0
1. Introduction
2. Neural Networks
2.1 What are neural networks?
2.2 Biological background
2.3 General structure of neural networks
2.4 Properties of neural networks
2.5 Learning in neural networks and weighting
2.6 Historical overview
2.7 Different network models
2.7.1 Single-layer feed-forward networks
2.7.2 Multi-layer feed-forward networks
2.7.3 Recurrent networks
3. MATLAB
3.1 General overview
3.2 MATLAB
3.3 Matrices in MATLAB
4. Realization of neural networks in MATLAB
4.1 Creation of neural networks with the Network/Data Manager
4.2 Import of data
4.3 The implementation of an ADALINE network
4.4 The implementation of a back-propagation network
4.5 Self-optimizing neural networks
4.6 Hopfield network
4.7 Summary
5. Conclusion
The primary objective of this bachelor thesis is to provide a practical manual for implementing various types of artificial neural networks using the MATLAB software environment, serving as a supplement for a Master-level course.
2.3 General structure of neural networks
Neural networks consist of simple, highly interconnected processing elements. These units are strongly idealized neurons that, like their biological counterparts, consist of three parts: a cell body, dendrites and an axon. Figure 2.3.1 compares a section of a biological neuron with an artificial one.
Neurons “communicate” via simple scalar messages passed between the elements, and neural networks can adaptively adjust the interaction between individual units. More generally, neural networks can be seen as a simplification of the massively parallel architecture of animal brains.
Every element of a network can be linked to many other elements, which is why network structures can become very intricate.
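The structure described above can be sketched in a few lines. The thesis itself works in MATLAB, so the following Python snippet is only an illustrative sketch of a single artificial neuron: the weighted inputs play the role of dendrites and synaptic strengths, the summation corresponds to the cell body, and the activation value is the axon's output. The sigmoid activation and the specific weights are illustrative choices, not taken from the thesis.

```python
import math

def neuron(inputs, weights, bias):
    """A single artificial neuron: weighted sum of inputs plus a bias,
    passed through a sigmoid activation (the 'axon' output)."""
    activation = sum(w * x for w, x in zip(weights, inputs)) + bias
    return 1.0 / (1.0 + math.exp(-activation))  # sigmoid squashes to (0, 1)

# Two "dendrite" inputs; the weights act as synaptic strengths.
out = neuron([1.0, 0.0], weights=[2.0, -1.0], bias=-1.0)
print(round(out, 3))  # sigmoid(1.0) ≈ 0.731
```

Interlinking many such units, with the output of one neuron feeding the inputs of others, yields the intricate network structures the excerpt mentions.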
1. Introduction: Presents the thesis as a manual for implementing neural networks in MATLAB, outlining the scope and the decision to focus on practical application rather than complex mathematical proofs.
2. Neural Networks: Provides a comprehensive overview of biological foundations, the general structure of artificial networks, historical milestones, and classifications of common neural network models.
3. MATLAB: Explains the basic functions of the MATLAB environment, specifically focusing on matrix manipulation, interface usage, and commands relevant to supporting neural network implementations.
4. Realization of neural networks in MATLAB: Details the practical implementation of several neural network architectures using the MATLAB Network/Data Manager, including data import procedures and specific training examples.
5. Conclusion: Summarizes the thesis, highlighting that while MATLAB is a powerful tool for initial neural network implementation, deeper complexities may require further advanced programming beyond the scope of the manual.
Neural networks, MATLAB, Artificial Intelligence, Back-propagation, ADALINE, Hopfield network, Self-organizing maps, Neural Network Toolbox, Pattern recognition, Learning algorithms, Data implementation, Matrix manipulation, Supervised learning, Unsupervised learning, Network topology.
The thesis focuses on providing a practical guide for implementing artificial neural networks within the MATLAB software environment for educational purposes.
The work covers neural network theory, MATLAB fundamentals, network structure classification, and practical implementation procedures using the software's graphical user interface.
The objective is to document how various neural network models can be successfully created and trained using the "Neural Network Toolbox" in MATLAB.
The author uses an experimental and descriptive approach, moving from theoretical basics to step-by-step practical implementation of specific network types.
The main part covers the setup of the MATLAB environment, importing training data, configuring different network architectures, and evaluating training performance through graphical output.
Core keywords include Neural Networks, MATLAB, Artificial Intelligence, Back-propagation, ADALINE, and Hopfield Networks.
The Network/Data Manager is used as the primary interface for creating, initializing, and training neural networks, avoiding the need to code every function manually.
The author distinguishes single-layer and multi-layer networks by their ability to solve linear versus nonlinear problems, noting that back-propagation in a multi-layer network is required for more complex tasks such as the XOR function.
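The XOR function is the classic example of a problem no single-layer network can solve, because its two classes are not linearly separable, while one hidden layer suffices. As a language-neutral illustration (the thesis uses MATLAB; the weights here are hand-picked, not trained), a tiny two-layer threshold network computing XOR might look like this:

```python
def step(x):
    """Threshold activation: fires (1) when the input is positive."""
    return 1 if x > 0 else 0

def xor_net(x1, x2):
    """Two-layer feed-forward network with fixed, hand-chosen weights.
    A single-layer perceptron cannot represent XOR."""
    h1 = step(x1 + x2 - 0.5)    # hidden unit: fires if at least one input is 1 (OR)
    h2 = step(x1 + x2 - 1.5)    # hidden unit: fires only if both inputs are 1 (AND)
    return step(h1 - h2 - 0.5)  # output: OR and not AND, i.e. XOR

print([xor_net(a, b) for a, b in [(0, 0), (0, 1), (1, 0), (1, 1)]])  # [0, 1, 1, 0]
```

In practice the weights are of course found by a learning rule such as back-propagation rather than set by hand; the sketch only shows why the hidden layer is needed.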
The Hopfield network serves as an example of a dynamic, recurrent network that operates differently from feed-forward models, highlighting the versatility of neural architectures.
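The characteristic behaviour of a Hopfield network, recalling a stored pattern from a corrupted input by repeatedly feeding its own output back in, can be sketched in a few lines. The thesis implements this in MATLAB's Neural Network Toolbox; the following pure-Python sketch (Hebbian outer-product weights, synchronous sign updates, one stored pattern) is only an assumption-laden illustration of the principle:

```python
def sign(x):
    return 1 if x >= 0 else -1

def train_hopfield(pattern):
    """Hebbian learning for one pattern: w_ij = p_i * p_j, no self-connections."""
    n = len(pattern)
    return [[0 if i == j else pattern[i] * pattern[j] for j in range(n)]
            for i in range(n)]

def recall(W, state, steps=5):
    """Recurrent dynamics: each unit takes the sign of its weighted input,
    and the new state is fed back in until it settles."""
    n = len(state)
    for _ in range(steps):
        state = [sign(sum(W[i][j] * state[j] for j in range(n))) for i in range(n)]
    return state

stored = [1, 1, -1, -1]
W = train_hopfield(stored)
noisy = [-1, 1, -1, -1]   # stored pattern with its first bit flipped
print(recall(W, noisy))   # [1, 1, -1, -1] -- the stored pattern is recovered
```

The feedback loop in `recall` is exactly what separates this model from the feed-forward networks of the earlier chapters: the output is not computed in one pass but emerges as a stable state of the dynamics.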

