Work is proceeding on a LinkedIn Learning course on building AI apps. Even before all of the OpenAI drama, I had decided to focus on working locally, without relying on cloud services or external APIs for either embeddings or the LLM itself.
Some of the things I'm showing off end up being rather obscure, in the sense that there are no existing examples out there of anyone doing the same thing. Both LangChain and LlamaIndex offer various ways to chat with your own docs, and I've been consulting their documentation heavily, as well as checking in with individual developers.
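To give a concrete sense of what "fully local" looks like, here's a minimal chat-with-your-docs sketch using LangChain, with Ollama serving the LLM and a local sentence-transformers model handling the embeddings. The file path and model names are placeholders, and the exact imports vary between LangChain versions, so treat this as an illustration rather than the course code itself.

```python
# A minimal local chat-with-your-docs sketch (assumptions: Ollama is running
# locally with a model pulled, e.g. `ollama pull mistral`, and the langchain,
# langchain-community, sentence-transformers, and chromadb packages are installed).
from langchain_community.document_loaders import TextLoader
from langchain_community.embeddings import HuggingFaceEmbeddings
from langchain_community.llms import Ollama
from langchain_community.vectorstores import Chroma
from langchain.chains import RetrievalQA
from langchain.text_splitter import RecursiveCharacterTextSplitter

# Load and chunk the document you want to chat with ("my_docs.txt" is a placeholder).
docs = TextLoader("my_docs.txt").load()
chunks = RecursiveCharacterTextSplitter(chunk_size=500, chunk_overlap=50).split_documents(docs)

# Embed the chunks with a local sentence-transformers model; no external API calls.
embeddings = HuggingFaceEmbeddings(model_name="sentence-transformers/all-MiniLM-L6-v2")
vectorstore = Chroma.from_documents(chunks, embeddings)

# Answer questions with a local LLM served by Ollama, grounded in the retrieved chunks.
llm = Ollama(model="mistral")
qa = RetrievalQA.from_chain_type(llm=llm, retriever=vectorstore.as_retriever())

print(qa.invoke({"query": "What does this document say about local embeddings?"})["result"])
```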
I want the code in this course to be a shining example of how to build these apps.
If this sounds appealing to you and you’re able to review some code, reach out and I’ll get you set up.
If you have burning questions about building AI apps, I’d also love to hear them, and I’ll see if I can address them in the course.
Use the About link at the top of the blog for contact info.