Talk on Artificial Intelligence, Deep Neural Networks and Deep Learning

[Talk banner: "In one hour, fit to get started with AI. Introduction to AI + deep learning: research breakthroughs, technical foundations and implementation."]

On June 19, 2018, we gave a talk (in German) on artificial intelligence, deep neural networks and deep learning at the Technologiehof Münster in the Let’s Talk series of the Venture Club Münster.

Our talk is available on YouTube and starts with 20 minutes of recent research milestones in applications of deep neural networks. In the following 40 minutes we explain the underlying mathematical principles of deep neural networks and deep learning, and introduce implementations with Python and TensorFlow as well as Java and Deeplearning4j.

We regularly give talks on deep neural networks, deep learning, artificial intelligence and software as a service (SaaS), as they are at the heart of LoyJoy. If you have feedback or want to invite us to speak at your event, please let us know. Special thanks go to the Technologieförderung Münster for hosting us!

Concepts covered:

  • An artificial neural network is a network of artificial neurons, loosely modeling a biological brain. It maps numerical inputs to outputs and can be trained for tasks such as computer vision, speech recognition and machine translation (a minimal forward-pass sketch follows this list).

  • Deep learning is a machine learning method for training deep neural networks, i.e. artificial neural networks with multiple layers.

  • ImageNet is a visual database with several million labeled images, used for visual object recognition research.

  • A convolutional neural network (CNN) is a class of artificial neural networks inspired by the visual cortex. It is primarily used in computer vision, but also has applications in other areas such as natural language processing (NLP); a sketch of a CNN and an LSTM model follows this list.

  • A recurrent neural network (RNN) is a class of artificial neural networks with feedback connections between layers. It can be applied to sequence tasks such as handwriting recognition, speech recognition and machine translation. Plain RNNs are affected by the vanishing gradient problem.

  • Long short-term memory (LSTM) is a type of RNN cell or layer that addresses the vanishing gradient problem of plain RNNs. Applications include grammar learning, machine translation, speech recognition and time series prediction.

  • Backpropagation is an algorithm for training deep neural networks. For a given input from the training data, the output of the network is computed and compared to the desired output from the training data. A loss function quantifies the difference, and its gradient with respect to the network's weights is then used to update those weights (see the training-loop sketch after this list).

  • Stochastic gradient descent (SGD) is an iterative method for minimizing the loss function. Extensions of SGD such as the momentum method, AdaGrad, RMSProp and Adam offer dynamic learning rates, i.e. step sizes that adapt per iteration.

  • An activation function defines the output of a neuron depending on its inputs. It loosely models the biological action potential and must be nonlinear for the network to solve nontrivial problems.
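
To make the input-to-output mapping concrete, here is a minimal sketch of a single dense layer of artificial neurons in plain NumPy. The layer sizes and input values are illustrative assumptions, not taken from the talk.

```python
# A minimal sketch of one dense layer of artificial neurons.
import numpy as np

def sigmoid(z):
    """Nonlinear activation: squashes any real input into (0, 1)."""
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(0)
W = rng.normal(size=(3, 4))   # 3 neurons, each with 4 input weights
b = np.zeros(3)               # one bias per neuron

x = np.array([0.5, -1.2, 3.0, 0.0])  # numerical input vector
y = sigmoid(W @ x + b)               # numerical output vector, shape (3,)
print(y)  # three values in (0, 1); exact numbers depend on the random weights
```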
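
The following sketch ties backpropagation, the loss function and gradient descent together by training a tiny two-layer network on XOR. For brevity it updates on the full four-example batch each step, rather than on random samples as in true stochastic gradient descent; the architecture, learning rate and epoch count are illustrative assumptions.

```python
# A compact sketch of training with backpropagation and gradient descent.
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# XOR training data: four 2-dimensional inputs with binary target outputs.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
T = np.array([[0], [1], [1], [0]], dtype=float)

rng = np.random.default_rng(42)
W1 = rng.normal(size=(2, 4)); b1 = np.zeros(4)  # hidden layer, 4 units
W2 = rng.normal(size=(4, 1)); b2 = np.zeros(1)  # output layer, 1 unit
lr = 1.0  # learning rate, i.e. the gradient descent step size

for epoch in range(5001):
    # Forward pass: compute the network's outputs for all training inputs.
    H = sigmoid(X @ W1 + b1)      # hidden activations
    Y = sigmoid(H @ W2 + b2)      # network outputs

    # Loss function: mean squared error against the desired outputs.
    loss = np.mean((Y - T) ** 2)
    if epoch % 1000 == 0:
        print(f"epoch {epoch}: loss {loss:.4f}")

    # Backward pass: apply the chain rule from the loss back to each weight.
    dY = 2 * (Y - T) / len(X)     # gradient of the loss w.r.t. Y
    dZ2 = dY * Y * (1 - Y)        # through the output sigmoid
    dW2 = H.T @ dZ2; db2 = dZ2.sum(axis=0)
    dH = dZ2 @ W2.T
    dZ1 = dH * H * (1 - H)        # through the hidden sigmoid
    dW1 = X.T @ dZ1; db1 = dZ1.sum(axis=0)

    # Gradient descent update: step each weight against its gradient.
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2

print(np.round(Y.ravel(), 2))  # typically approaches [0, 1, 1, 0]
```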
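
Finally, a sketch of how the CNN and LSTM architectures named above might be declared with TensorFlow's Keras API. The input shapes and layer sizes are assumptions for illustration and do not reproduce the examples from the talk.

```python
# Minimal, untrained model definitions with TensorFlow's Keras API.
import tensorflow as tf

# Convolutional network for small images, e.g. 28x28 grayscale digits.
cnn = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(28, 28, 1)),
    tf.keras.layers.Conv2D(16, kernel_size=3, activation="relu"),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(10, activation="softmax"),
])

# Recurrent network with an LSTM layer for sequences of feature vectors.
rnn = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(None, 8)),  # variable-length sequences
    tf.keras.layers.LSTM(32),                # gated cell mitigates vanishing gradients
    tf.keras.layers.Dense(1, activation="sigmoid"),
])

cnn.summary()
rnn.summary()
```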

The talk also walks through a number of recent research papers and application examples.
