Please use this identifier to cite or link to this item: https://dspace.iiti.ac.in/handle/123456789/10401
Full metadata record
DC Field: Value (Language)
dc.contributor.author: Anand, Samarth (en_US)
dc.contributor.author: Kumpatla, Vijay Babu (en_US)
dc.contributor.author: Ahuja, Kapil [Guide] (en_US)
dc.date.accessioned: 2022-07-05T12:29:14Z
dc.date.available: 2022-07-05T12:29:14Z
dc.date.issued: 2022-05-25
dc.identifier.uri: https://dspace.iiti.ac.in/handle/123456789/10401
dc.description.abstract: Deep learning is used to solve complex day-to-day problems. Solving machine learning tasks requires the use of large deep neural networks (DNNs). However, due to their sheer size and the computation cost associated with them, it is very difficult to deploy these DNN models on embedded systems. As deep neural network models grow in scale, they require more computation and memory. These models can nevertheless be implemented on low-power devices with some approximation while maintaining network accuracy. Many recent studies have focused on reducing the size and complexity of these DNNs using various techniques. Quantisation of the DNN model parameters is one such technique: it reduces the size of the model by representing the high-precision parameters in a lower bit-width representation. In this thesis we look at some efficient quantisation techniques that do not require any retraining or fine-tuning of the model after quantisation. We also try to reduce the quantisation-induced error by making use of sampling techniques. (en_US)
dc.language.iso: en (en_US)
dc.publisher: Department of Computer Science and Engineering, IIT Indore (en_US)
dc.relation.ispartofseries: BTP594;CSE 2022 ANA
dc.subject: Computer Science and Engineering (en_US)
dc.title: Approximate deep learning (en_US)
dc.type: B.Tech Project (en_US)
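
The abstract above describes representing high-precision DNN parameters in a lower bit-width without retraining. As a rough illustration of that idea (not the thesis's specific method), here is a minimal sketch of post-training uniform affine quantisation in Python with NumPy; the function names (quantise_uniform, dequantise) and the random "weights" are hypothetical, for demonstration only.

```python
import numpy as np

def quantise_uniform(w, num_bits=8):
    """Uniform affine quantisation of a float tensor to num_bits integers."""
    qmin, qmax = 0, 2 ** num_bits - 1
    w_min, w_max = float(w.min()), float(w.max())
    scale = max((w_max - w_min) / (qmax - qmin), 1e-8)  # real-valued step size
    zero_point = int(round(qmin - w_min / scale))       # integer offset for 0.0
    q = np.clip(np.round(w / scale + zero_point), qmin, qmax).astype(np.int32)
    return q, scale, zero_point

def dequantise(q, scale, zero_point):
    """Recover an approximate float tensor from the quantised representation."""
    return scale * (q.astype(np.float32) - zero_point)

# Example: quantise random "weights" and measure the quantisation-induced error
w = np.random.randn(4, 4).astype(np.float32)
q, s, z = quantise_uniform(w, num_bits=8)
w_hat = dequantise(q, s, z)
print("max abs error:", np.abs(w - w_hat).max())
```

The quantisation-induced error measured at the end is exactly the quantity the abstract proposes to reduce further, for example by choosing the quantisation parameters via sampling rather than from the raw min/max.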
Appears in Collections: Department of Computer Science and Engineering_BTP

Files in This Item:
File: BTP_594_Samarth_Anand_180001046_Vijay_Babu_Kumpatla_180001027.pdf (2.15 MB, Adobe PDF)


Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.