COURSE UNIT TITLE: DEEP LEARNING AND CONVOLUTIONAL NEURAL NETWORKS

Description of Individual Course Units

Course Unit Code: CSC 5020
Course Unit Title: DEEP LEARNING AND CONVOLUTIONAL NEURAL NETWORKS
Type of Course: ELECTIVE
D / U / L (weekly lecture / tutorial / laboratory hours): 3 / 0 / 0
ECTS: 8

Offered By

Graduate School of Natural and Applied Sciences

Level of Course Unit

Second Cycle Programmes (Master's Degree)

Course Coordinator

ASSISTANT PROFESSOR METE EMINAĞAOĞLU

Offered to

Computer Science
Ph.D. in Computer Science

Course Objective

To provide students with a comprehensive theoretical and applied study of deep learning, convolutional neural networks and related advanced topics in machine learning and artificial intelligence. To establish in-depth knowledge of deep hierarchies and learning mechanisms in humans, deep vs. shallow architectures, restricted Boltzmann machines, deep belief networks and their applications to pattern recognition, speech recognition and natural language processing.

Learning Outcomes of the Course Unit

1   Implement and develop solutions to problems in companies or institutions with the aid of deep learning and convolutional neural networks.
2   Gain knowledge to effectively design and implement convolutional neural network models for natural language processing and text mining.
3   Gain knowledge to effectively design and implement deep learning models for pattern recognition and machine vision.
4   Plan, manage and use different methodologies, procedures and techniques in deep learning and convolutional neural networks.
5   Develop or implement research projects in the area of deep learning and convolutional neural networks.

Mode of Delivery

Face-to-Face

Prerequisites and Co-requisites

None

Recommended Optional Programme Components

None

Course Contents

Week Subject Description
1 Introduction to Artificial Neural Networks. Biological neuron, Artificial neuron models, Perceptron learning, Multi-layer perceptron learning, Backpropagation. Supervised, semi-supervised and unsupervised learning.
2 Basics of Artificial Neural Networks. Gradient descent, stochastic gradient descent and other optimization methods; issues of convergence, overfitting and generalization (a minimal training sketch follows this schedule).
3 Introduction to computer vision and pattern recognition. Hierarchies in human vision. Approaches based on engineered features, sparse coding and multiple hierarchies; the levels of a hierarchy and their responsibilities.
4 Introduction to Deep Learning. Auto-encoders. Learning representations with auto-encoders, influence of sparsity and issues of sparse data.
5 Convolutional Neural Networks. Basic theoretical concepts. Different layers of processing: convolution, pooling, drop-out, loss, training (an illustrative sketch follows this schedule).
6 Applications of Convolutional Neural Networks.
7 Restricted Boltzmann Machines. Basic theoretical concepts. Hopfield Networks, Boltzmann Machines, training models.
8 Applications of Restricted Boltzmann Machines and Boltzmann Machines.
9 Deep Belief Networks. Theoretical background. Applications of Deep Belief Networks.
10 Deep Recurrent Networks and Sequence Learning. Simple recurrent networks: Elman and Jordan networks. Standard recurrent neural networks, Long Short-Term Memory (LSTM).
11 Applications of Deep Recurrent Networks.
12 Alternative Deep Learning Models: Deep Reinforcement Learning, Neural Turing Machines.
13 Deep vs. shallow learning. Advanced problems and current issues in deep learning and convolutional neural networks.
14 Project demos / presentations. General discussion and review of the topics covered throughout the term.
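
As a concrete reference point for the week 1-2 material (multi-layer perceptron learning, backpropagation, stochastic gradient descent), the following is a minimal sketch in plain NumPy. The synthetic dataset, network size and learning rate are illustrative assumptions, not part of the syllabus.

```python
# Minimal multi-layer perceptron trained with backpropagation and SGD.
# Hypothetical example: 2-D synthetic data, one hidden layer, sigmoid output.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic binary-classification data (assumption: an XOR-like problem).
X = rng.uniform(-1.0, 1.0, size=(200, 2))
y = ((X[:, 0] * X[:, 1]) > 0).astype(float).reshape(-1, 1)

n_hidden = 8
W1 = rng.normal(0.0, 0.5, size=(2, n_hidden))
b1 = np.zeros(n_hidden)
W2 = rng.normal(0.0, 0.5, size=(n_hidden, 1))
b2 = np.zeros(1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 0.1        # learning rate (illustrative value)
for epoch in range(200):
    # Stochastic gradient descent: one randomly ordered example per update.
    for i in rng.permutation(len(X)):
        x_i, y_i = X[i:i + 1], y[i:i + 1]

        # Forward pass.
        h = np.tanh(x_i @ W1 + b1)      # hidden activations, shape (1, n_hidden)
        p = sigmoid(h @ W2 + b2)        # predicted probability, shape (1, 1)

        # Backward pass (backpropagation) for binary cross-entropy loss.
        dz2 = p - y_i                   # gradient at the output pre-activation
        dW2 = h.T @ dz2
        db2 = dz2.ravel()
        dz1 = (dz2 @ W2.T) * (1.0 - h ** 2)   # tanh derivative
        dW1 = x_i.T @ dz1
        db1 = dz1.ravel()

        # SGD parameter update.
        W2 -= lr * dW2; b2 -= lr * db2
        W1 -= lr * dW1; b1 -= lr * db1

# Training accuracy on the synthetic data.
pred = sigmoid(np.tanh(X @ W1 + b1) @ W2 + b2) > 0.5
print("training accuracy:", (pred == (y > 0.5)).mean())
```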
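For the convolutional building blocks of week 5 (convolution, pooling, drop-out, loss and training), a minimal sketch is given below. It assumes the PyTorch library; the architecture, layer sizes and the random stand-in data are illustrative choices, not prescribed by the course.

```python
# Minimal convolutional network with convolution, pooling, drop-out and one
# SGD training step on random data (illustrative sizes; assumes PyTorch).
import torch
import torch.nn as nn
import torch.nn.functional as F

class SmallCNN(nn.Module):
    def __init__(self, n_classes: int = 10):
        super().__init__()
        self.conv1 = nn.Conv2d(1, 16, kernel_size=3, padding=1)   # convolution layer
        self.conv2 = nn.Conv2d(16, 32, kernel_size=3, padding=1)
        self.pool = nn.MaxPool2d(2)                                # pooling layer
        self.dropout = nn.Dropout(p=0.5)                           # drop-out regularization
        self.fc = nn.Linear(32 * 7 * 7, n_classes)                 # fully connected output

    def forward(self, x):
        x = self.pool(F.relu(self.conv1(x)))   # 28x28 -> 14x14
        x = self.pool(F.relu(self.conv2(x)))   # 14x14 -> 7x7
        x = self.dropout(torch.flatten(x, 1))
        return self.fc(x)

model = SmallCNN()
criterion = nn.CrossEntropyLoss()              # loss layer
optimizer = torch.optim.SGD(model.parameters(), lr=0.01, momentum=0.9)

# One training step on a random mini-batch standing in for real image data.
images = torch.randn(8, 1, 28, 28)             # batch of 8 grayscale 28x28 images
labels = torch.randint(0, 10, (8,))
optimizer.zero_grad()
loss = criterion(model(images), labels)
loss.backward()                                # backpropagation
optimizer.step()                               # SGD parameter update
print("loss after one step:", loss.item())
```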

Recommended or Required Reading

I. Goodfellow, Y. Bengio and A. Courville, Deep Learning, MIT Press, 2016.
L. Deng and D. Yu, Deep Learning: Methods and Applications, Foundations and Trends in Signal Processing, Now Publishers, 2014.

Supplementary Book(s):
C. Bishop, Pattern Recognition and Machine Learning, Springer, 2006.
K. P. Murphy, Machine Learning: A Probabilistic Perspective, MIT Press, 2012.

Planned Learning Activities and Teaching Methods

The course is taught in a lecture, class presentation and discussion format. In addition to the lectures, assigned groups prepare presentations and deliver them in a discussion session. In some weeks, the results of previously assigned homework are discussed.

Assessment Methods

NO. SHORT CODE LONG CODE FORMULA
1 MTE MIDTERM EXAM
2 ASG ASSIGNMENT
3 FIN FINAL EXAM
4 FCG FINAL COURSE GRADE MTE * 0.20 + ASG * 0.40 + FIN * 0.40
5 RST RESIT
6 FCGR FINAL COURSE GRADE (RESIT) MTE * 0.20 + ASG * 0.40 + RST * 0.40
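
As an illustration with hypothetical scores, a student receiving 70 on the midterm, 80 on the assignments and 90 on the final exam would obtain FCG = 70 * 0.20 + 80 * 0.40 + 90 * 0.40 = 82.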


Further Notes About Assessment Methods

None.

Assessment Criteria

None.

Language of Instruction

English

Course Policies and Rules

To be announced.

Contact Details for the Lecturer(s)

mete.eminagaoglu@deu.edu.tr

Office Hours

Will be announced.

Work Placement(s)

None

Workload Calculation

Activities Number Time (hours) Total Workload (hours)
Lectures 14 3 42
Preparations before/after weekly lectures 13 4 52
Preparation for final exam 1 24 24
Preparing assignments 2 30 60
Preparing presentations 2 14 28
Final 1 2 2
TOTAL WORKLOAD (hours) 208

Contribution of Learning Outcomes to Programme Outcomes

PO / LO  PO.1  PO.2  PO.3  PO.4  PO.5  PO.6  PO.7  PO.8  PO.9  PO.10
LO.1  4  4  4  4
LO.2  4  4  4  4
LO.3  4  4  4  4
LO.4  4  4  4  4
LO.5  4  4  4  4