Reading the LLaMA code

The LLaMA and LLaMA 2 models released by Meta/Facebook are available on GitHub, along with a guide to help you use them. As you would expect from Meta, the models are built on PyTorch. But surprisingly, the repo on GitHub is so short that you can read and understand it in a day... [more]

Self-hosted Copilot for Your VSCode

GitHub offers its Copilot service by subscription. It is a coding assistant that runs in your IDE: a plugin in your editor auto-completes the code you type. There are also off-the-shelf language models that can generate code much like GitHub's Copilot. But the model... [more]

Conda and CUDA

If we want to run TensorFlow or PyTorch with CUDA on Linux, for example, we can install CUDA as a system library first and then install the Python package with pip (or, in rare cases, via apt-get). This way, the package will find the CUDA library in the system locations.... [more]
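The workflow described above might be sketched as follows; the package names and the verification one-liner are illustrative assumptions, not commands taken from the full post.

```shell
# Sketch of the approach above: install CUDA at the system level first,
# then install the deep learning framework with pip.
# (Ubuntu-style commands assumed for illustration.)

# 1. Install the CUDA toolkit as a system library
sudo apt-get install -y nvidia-cuda-toolkit

# 2. Install PyTorch with pip; it will locate the CUDA
#    runtime at the usual system locations
pip install torch

# 3. Verify that the framework actually sees the GPU
python -c "import torch; print(torch.cuda.is_available())"
```

An alternative, which the post's title hints at, is to let conda provide the CUDA runtime inside the environment instead of installing it system-wide, so different environments can carry different CUDA versions.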