
Please use this identifier to cite or link to this item: http://dspace.bits-pilani.ac.in:8080/jspui/handle/123456789/16108
Full metadata record
dc.contributor.author      Bhatia, Ashutosh
dc.date.accessioned        2024-10-16T07:13:37Z
dc.date.available          2024-10-16T07:13:37Z
dc.date.issued             2018
dc.identifier.uri          https://ieeexplore.ieee.org/abstract/document/8588788
dc.identifier.uri          http://dspace.bits-pilani.ac.in:8080/jspui/handle/123456789/16108
dc.description.abstract    It has been proven that Recurrent Neural Networks (RNNs) are Turing complete, i.e., for any computable function there exists a finite RNN that computes it. Consequently, researchers have trained RNNs to learn simple functions such as sorting, addition, and compression, and more recently even classical cryptographic ciphers such as the Enigma. In this paper, we try to identify the characteristics of functions that make them easy or difficult for an RNN to learn. We look at functions from a cryptographic point of view by studying the ways in which the output depends on the input. We use the cryptographic parameters (confusion and diffusion) that determine the strength of a cipher to quantify this dependence, and show that a strong correlation exists between the learning capability of an RNN and the function's cryptographic parameters. [en_US]
dc.language.iso            en [en_US]
dc.publisher               IEEE [en_US]
dc.subject                 Computer Science [en_US]
dc.subject                 Recurrent neural networks [en_US]
dc.subject                 Cryptographic Ciphers [en_US]
dc.subject                 Confusion Parameter [en_US]
dc.subject                 Diffusion Parameter [en_US]
dc.title                   On the Learning Capabilities of Recurrent Neural Networks: A Cryptographic Perspective [en_US]
dc.type                    Article [en_US]
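
The abstract above quantifies how strongly a function's output depends on its input using the cryptographic notions of confusion and diffusion. As a rough illustration of the diffusion idea only, the following minimal sketch estimates an avalanche-style diffusion score for a toy 8-bit function by flipping single input bits and measuring what fraction of output bits change. The function toy_cipher, the bit width, and the sampling scheme are illustrative assumptions and are not taken from the paper.

    # Minimal sketch (not the paper's exact metric): estimating a diffusion-style
    # parameter for a bit-string function via the avalanche effect.
    import random

    def toy_cipher(bits):
        """Illustrative 8-bit toy function (assumption): rotate left by 3, then XOR a constant."""
        rotated = bits[3:] + bits[:3]
        key = [1, 0, 1, 1, 0, 0, 1, 0]
        return [b ^ k for b, k in zip(rotated, key)]

    def estimate_diffusion(func, n_bits=8, trials=1000):
        """Average fraction of output bits that flip when one random input bit is flipped."""
        total = 0.0
        for _ in range(trials):
            x = [random.randint(0, 1) for _ in range(n_bits)]
            i = random.randrange(n_bits)
            x_flipped = x.copy()
            x_flipped[i] ^= 1
            y, y_flipped = func(x), func(x_flipped)
            total += sum(a != b for a, b in zip(y, y_flipped)) / len(y)
        return total / trials  # values near 0.5 indicate strong diffusion

    if __name__ == "__main__":
        print(f"estimated diffusion: {estimate_diffusion(toy_cipher):.3f}")

On this toy function the estimate comes out low (about 0.125, since flipping one input bit flips exactly one output bit after the rotation and XOR), whereas a strong cipher would approach 0.5; the paper's claim is that such parameters correlate with how hard a function is for an RNN to learn.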
Appears in Collections: Department of Computer Science and Information Systems

Files in This Item:
There are no files associated with this item.


Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.