hi, I don't use Debian-based Linux so I can't test this, but I had some nits for the website: 1. I would specify the OS dependencies (apt and systemd), and 2. I would display the actual script under the install command to make it easier to review what's going on (faster than clicking the small GitHub link and then opening the relevant file).
Hey. I liked your feedback, so I updated the website.
You can give it a shot in a VM: I tested on macOS, then installed it on an Ubuntu 24.04 home box and also in Multipass. There's a how-to in the GitHub repo.
The reliability of the information depends on two factors:
* The quality of the project resources you're working with. If you use your actual textbooks from class, the quality will be higher; if you use random articles found online, it will be worse. You have complete control over the source material that the LLM uses as context.
* The specific model that you're working with. The system is model-agnostic, so you can bring your favorite model (rough sketch of the flow below).
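To make that concrete, here's a minimal sketch of the flow I have in mind. The names (like `ask_with_sources`) are placeholders and I'm assuming an OpenAI-compatible chat API; this isn't the actual code, just the shape of it:

```python
# Minimal sketch (illustrative names, not the real code): the user-supplied
# course material is the only context the model sees, and the model itself is
# just a parameter, so any provider's model can be plugged in.
from pathlib import Path

from openai import OpenAI  # assuming an OpenAI-compatible client library


def ask_with_sources(client: OpenAI, model: str, question: str, source_files: list[str]) -> str:
    # Concatenate whatever material the user chose: class textbooks -> higher
    # quality answers, random online articles -> lower quality answers.
    context = "\n\n".join(Path(path).read_text() for path in source_files)
    response = client.chat.completions.create(
        model=model,  # model-agnostic: swap in whichever model you prefer
        messages=[
            {"role": "system", "content": "Answer only from the provided course material."},
            {"role": "user", "content": f"Material:\n{context}\n\nQuestion: {question}"},
        ],
    )
    return response.choices[0].message.content
```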
On privacy, I plan to make this a local-first application where you bring your own LLM API key from any provider, and the API calls are made directly from your own machine to that provider. Instead of a third-party LLM provider, you can also use a local model running on your machine for maximum privacy. This is also the easiest way to ensure users keep ownership of their educational material.
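Roughly, the provider routing would look something like this. It's a sketch under assumptions, not the implementation; Ollama is just one example of a local server with an OpenAI-compatible endpoint:

```python
# Rough sketch of the local-first idea: the same client code talks either to
# a third-party provider using your own API key (calls go straight from your
# machine to the provider) or to a model server running locally.
# The endpoint and key handling here are examples, not the final design.
import os

from openai import OpenAI


def make_client(use_local_model: bool) -> OpenAI:
    if use_local_model:
        # e.g. an Ollama instance on your own machine exposing an
        # OpenAI-compatible endpoint; nothing leaves the box.
        return OpenAI(base_url="http://localhost:11434/v1", api_key="unused")
    # Bring-your-own-key: read the key locally and call the provider directly.
    return OpenAI(api_key=os.environ["OPENAI_API_KEY"])
```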