Build a Web Scraping AI agent with Llama-3 Running Locally (100% free, no internet required)
1. Install the necessary Python libraries by running the install command from your terminal.
2. Download @ollama and pull the following models:
• Llama-3 as the main LLM
• nomic-embed-text as the embedding model
3. Check that Ollama is running at localhost on port 11434. If not, you can try serving the model with... (a terminal sketch for steps 2 and 3 follows below).
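A minimal terminal sketch for steps 2 and 3. The exact model tags (llama3, nomic-embed-text) and the serve fallback are the standard Ollama defaults and are assumed here, since the post does not show the commands verbatim:

```bash
# Pull the main LLM and the embedding model (assumes Ollama is already installed)
ollama pull llama3
ollama pull nomic-embed-text

# Check that the Ollama server is up on its default port (11434);
# it should reply with "Ollama is running"
curl http://localhost:11434

# If nothing is listening on that port, start the server manually
ollama serve
```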