
On the Learning Capabilities of Recurrent Neural Networks: A Cryptographic Perspective


dc.contributor.author Bhatia, Ashutosh
dc.date.accessioned 2024-10-16T07:13:37Z
dc.date.available 2024-10-16T07:13:37Z
dc.date.issued 2018
dc.identifier.uri https://ieeexplore.ieee.org/abstract/document/8588788
dc.identifier.uri http://dspace.bits-pilani.ac.in:8080/jspui/handle/123456789/16108
dc.description.abstract It has been proven that Recurrent Neural Networks (RNNs) are Turing complete, i.e., for any computable function there exists a finite RNN that computes it. Consequently, researchers have trained RNNs to learn simple functions such as sorting, addition, and compression, and more recently even classical cryptographic ciphers such as the Enigma. In this paper, we try to identify the characteristics of functions that make them easy or difficult for an RNN to learn. We look at functions from a cryptographic point of view by studying the ways in which the output depends on the input. We use the cryptographic parameters of confusion and diffusion, which determine the strength of a cipher, to quantify this dependence, and show that a strong correlation exists between an RNN's learning capability and a function's cryptographic parameters. en_US
dc.language.iso en en_US
dc.publisher IEEE en_US
dc.subject Computer Science en_US
dc.subject Recurrent neural networks en_US
dc.subject Cryptographic Ciphers en_US
dc.subject Confusion Parameter en_US
dc.subject Diffusion Parameter en_US
dc.title On the Learning Capabilities of Recurrent Neural Networks: A Cryptographic Perspective en_US
dc.type Article en_US
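
The diffusion parameter mentioned in the abstract is commonly quantified via the avalanche effect: flip a single input bit and measure, on average, what fraction of the output bits change. The record does not include the paper's exact metric, so the Python sketch below only illustrates this general idea; the toy functions identity and mix are hypothetical examples for illustration, not ciphers studied in the paper.

    # A minimal sketch (not the paper's exact definition) of quantifying
    # diffusion via the avalanche effect: flip one input bit and count how
    # many output bits change, averaged over random inputs and bit positions.
    import random

    def avalanche(f, n_bits, trials=1000):
        """Estimate the mean fraction of output bits that flip when a
        single input bit is flipped. Values near 0.5 suggest strong
        diffusion; values near 0 suggest weak diffusion."""
        total = 0.0
        for _ in range(trials):
            x = random.getrandbits(n_bits)
            i = random.randrange(n_bits)
            y1, y2 = f(x), f(x ^ (1 << i))
            total += bin(y1 ^ y2).count("1") / n_bits
        return total / trials

    # Hypothetical toy functions, for illustration only:
    identity = lambda x: x                          # weak diffusion (~1/n_bits)
    mix = lambda x: (x * 0x9E3779B1) & 0xFFFFFFFF   # multiplicative mixing, stronger diffusion

    print(avalanche(identity, 32))  # close to 1/32
    print(avalanche(mix, 32))       # noticeably higher

Under the paper's thesis, a function scoring low on such diffusion and confusion measures (like identity) should be easier for an RNN to learn than one scoring high.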


Files in this item


There are no files associated with this item.
