v13, page 48, Table 10 caption:

> For quantization, we employ bitesandbytes to quantize the 16-bit models to 8/4 bits

might be:

> For quantization, we employ bitsandbytes to quantize the 16-bit models to 8/4 bits

i.e., "bitsandbytes" instead of "bitesandbytes".
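
For context, the tool named in the corrected caption is the bitsandbytes library, typically driven through Hugging Face Transformers to load a 16-bit checkpoint in 8-bit or 4-bit precision. Below is a minimal sketch of that usage; the model name is a placeholder and the quantization settings are assumptions for illustration, not the paper's actual configuration.

```python
# Minimal sketch: loading a 16-bit model in 8-bit or 4-bit with bitsandbytes via Transformers.
# The model name is hypothetical; the authors' exact settings are not given by the caption.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig

model_name = "facebook/opt-1.3b"  # placeholder model

# 8-bit quantization of the 16-bit weights
config_8bit = BitsAndBytesConfig(load_in_8bit=True)

# 4-bit quantization (NF4 with bfloat16 compute is a common default, assumed here)
config_4bit = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_compute_dtype=torch.bfloat16,
)

tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(
    model_name,
    quantization_config=config_8bit,  # or config_4bit for 4-bit loading
    device_map="auto",
)
```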