PyTorch Notes: Quantization Aware Training (QAT)

PyTorch offers two backend choices for quantized computation: fbgemm and qnnpack, targeting x86 and ARM CPUs respectively.
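
A minimal sketch of selecting the backend, assuming the eager-mode quantization API (the `backend` variable name is just for illustration):

```python
import torch

# Pick the quantization backend: "fbgemm" for x86 servers,
# "qnnpack" for ARM / mobile targets.
backend = "fbgemm"  # or "qnnpack"

# Tell PyTorch which quantized kernels to use at runtime.
torch.backends.quantized.engine = backend

# The default QAT qconfig takes the same backend string, so the
# observer / fake-quant settings match the chosen kernels.
qat_qconfig = torch.quantization.get_default_qat_qconfig(backend)
print(qat_qconfig)
```

The backend string is used in two places: once to select the runtime engine, and once to pick a qconfig whose observer ranges are compatible with that engine.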