€17.95
incl. VAT
Free shipping
Ready to ship in 1-2 weeks
  • Paperback

Product Description
Seminar paper from the year 2018 in the subject Engineering - Computer Engineering, grade: 1.0, University of Paderborn, language: English, abstract: Convolutional Neural Networks (CNNs) are state-of-the-art neural networks used in many fields such as video analysis, face detection, and image classification. Due to their high demands on computational resources and memory bandwidth, CNNs are mainly executed on special-purpose accelerator hardware, which is more powerful and energy-efficient than general-purpose processors. This paper gives an overview of the use of FPGAs for accelerating computation-intensive CNNs with OpenCL and proposes two implementation alternatives. The first approach is based on nested loops inspired by the mathematical formula of multidimensional convolution. The second strategy transforms the computation into a matrix multiplication on the fly. Both approaches are then refined with common optimization techniques for FPGA designs based on high-level synthesis (HLS). Finally, the proposed implementations are compared to a CNN implementation on an Intel Xeon CPU to demonstrate the advantages in terms of performance and energy efficiency.
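For illustration, a minimal sketch of the first (nested-loop) approach is given below in plain C rather than as an OpenCL kernel; the function name, array layouts, and the assumption of unit stride with no padding are illustrative choices, not taken from the paper:

/* Direct multidimensional convolution as nested loops:
 * loop over output channels, output positions, input channels
 * and the kernel window, accumulating one output value at a time. */
void conv2d_direct(const float *input,   /* [C_in][H][W]          */
                   const float *weights, /* [C_out][C_in][K][K]   */
                   float *output,        /* [C_out][H-K+1][W-K+1] */
                   int C_in, int C_out, int H, int W, int K)
{
    int H_out = H - K + 1;
    int W_out = W - K + 1;

    for (int co = 0; co < C_out; ++co)
        for (int y = 0; y < H_out; ++y)
            for (int x = 0; x < W_out; ++x) {
                float acc = 0.0f;
                for (int ci = 0; ci < C_in; ++ci)
                    for (int ky = 0; ky < K; ++ky)
                        for (int kx = 0; kx < K; ++kx)
                            acc += input[(ci * H + (y + ky)) * W + (x + kx)]
                                 * weights[((co * C_in + ci) * K + ky) * K + kx];
                output[(co * H_out + y) * W_out + x] = acc;
            }
}

The second strategy described in the abstract would instead rearrange each input window into the row of a matrix on the fly (an im2col-style transformation, as one plausible reading), so that the whole convolution reduces to a single matrix multiplication that HLS tools can pipeline efficiently.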
Note: This item can only be shipped to a delivery address in Germany.