
Deep Evolutionary Learning (DEL)


Need for a better learning algorithm for Deep Learning:

Limitations of Backpropagation:

  • It is slow: all previous layers are locked until the gradients for the current layer are calculated (with Stochastic Steepest Descent).

  • It suffers from the vanishing and exploding gradients problem.

  • It considers only the predicted and actual values when calculating the error and the gradients; this relates to the objective function and only partially to the Backpropagation algorithm itself (see the sketch after this list).

  • It does not consider the spatial, associative and dis-associative relationships between classes when calculating errors; again, this relates to the objective function and only partially to the Backpropagation algorithm.

  • Performance degrades with increasing noise and low contrast between objects (due to the gradient calculations).
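A minimal sketch of the point above about the error term: in a standard backpropagation setup, the loss and the gradient passed back through the layers are functions of the predicted and actual values only, with no spatial or associative structure between classes entering the calculation. The mean-squared-error loss and the toy values below are illustrative assumptions, not code from this page.

```python
import numpy as np

# Illustrative only: a standard backpropagation error signal is computed
# from the predicted and target values alone.

def mse_loss_and_grad(y_pred: np.ndarray, y_true: np.ndarray):
    """Mean-squared-error loss and its gradient w.r.t. the prediction."""
    diff = y_pred - y_true                 # only predicted vs. actual values
    loss = 0.5 * np.mean(diff ** 2)
    grad = diff / y_pred.size              # gradient fed back through the layers
    return loss, grad

# Example: the gradient that backpropagation propagates to earlier layers
y_pred = np.array([0.8, 0.1, 0.1])
y_true = np.array([1.0, 0.0, 0.0])
loss, grad = mse_loss_and_grad(y_pred, y_true)
print(loss, grad)
```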

Advantages of the Proposed Deep Evolutionary Learning (DEL):

  • DEL uses Evolutionary Strategies for adaptive step-length control when optimizing the objective function during training (see the sketch after this list)

  • Evolutionary Strategies are used for training with an adaptive decision-criterion temperature (T)

  • For these reasons, no gradient calculations are required for the layers

  • Unsupervised learning is used with feedforward calculations

  • There is not much reduction in learning time consumption

  • Performance is better

  • Noise has no effect on training performance

  • Realisation is easy, even for complex topologies.
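Below is a minimal sketch of a gradient-free training loop in the spirit of the list above: a (1+1)-style Evolutionary Strategy that mutates the weights of a small feedforward network, adapts the mutation step length with the classic 1/5-success rule, and accepts candidates with a temperature-controlled (simulated-annealing-style) test, which is one possible reading of the "adaptive decision criteria temperature (T)" above. The network, the fitness function and all parameter values are illustrative assumptions; the page does not give the concrete DEL algorithm.

```python
import numpy as np

# Hypothetical sketch of an ES-style, gradient-free training loop.
# The tiny feedforward model and fitness function are illustrative
# assumptions, not the DEL implementation.

def forward(weights, x):
    """Feedforward pass of a one-hidden-layer network (illustrative)."""
    w1, w2 = weights
    h = np.tanh(x @ w1)
    return h @ w2

def fitness(weights, x, y):
    """Negative mean-squared error: higher is better."""
    return -np.mean((forward(weights, x) - y) ** 2)

def del_train(x, y, hidden=8, steps=2000, sigma=0.1, temperature=1.0, seed=0):
    rng = np.random.default_rng(seed)
    weights = [rng.normal(0, 0.5, (x.shape[1], hidden)),
               rng.normal(0, 0.5, (hidden, y.shape[1]))]
    best = fitness(weights, x, y)
    successes = 0
    for step in range(1, steps + 1):
        # Mutate every weight matrix; no gradients are computed anywhere.
        candidate = [w + sigma * rng.normal(size=w.shape) for w in weights]
        cand_fit = fitness(candidate, x, y)
        # Temperature-controlled acceptance: worse candidates may still pass.
        accept = cand_fit > best or rng.random() < np.exp((cand_fit - best) / temperature)
        if accept:
            weights, best = candidate, cand_fit
            successes += 1
        if step % 20 == 0:
            # 1/5-success rule: widen the step if mutations succeed often,
            # shrink it otherwise; cool the temperature as training proceeds.
            sigma *= 1.22 if successes / 20 > 0.2 else 0.82
            temperature *= 0.99
            successes = 0
    return weights, best

# Usage: fit a toy regression problem without any backpropagation.
rng = np.random.default_rng(1)
x = rng.normal(size=(64, 4))
y = np.sin(x.sum(axis=1, keepdims=True))
weights, best = del_train(x, y)
print("best fitness:", best)
```

The intent of the sketch is only to show that the layers are never touched by gradient calculations: training proceeds purely by mutating weights, evaluating a fitness score in the feedforward direction, and accepting or rejecting each candidate.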

Recognition of Low Contrast Objects with Histogram Oriented Activation Function and DEL for Deep CNN:
