Since the debut of ChatGPT, powered by GPT-3.5, in December 2022, several significant language models have emerged, including LLaMA, Llama 2, PaLM 2, GPT-4, Alpaca, Vicuna-13B, and more, and the trend shows no sign of slowing. To fully harness the potential of generative AI and build customised models, these free training programs will prove invaluable.
Generative AI for Everyone
Andrew Ng’s “Generative AI for Everyone” is an in-depth course on generative AI, covering its capabilities and limitations, with practical exercises on everyday use, prompt engineering, and advanced AI techniques. The course focuses on real-world applications, showing how generative AI is commonly used and offering hands-on experience. It also discusses AI’s effect on business and society, preparing students to develop effective AI strategies and understand their real-world implications.
Introduction to LLMs
Google, which is currently building its next LLM, Gemini, also provides a series of free courses on mastering generative AI. In this course, Google gives an overview of LLMs, defines key terms, and explains potential applications.
It also delves into the concept of prompt engineering, which can improve the performance of LLMs. Additionally, the module introduces various Google tools that can assist in the development of personalised generative AI applications.
LLMs: Application through Production
This course by Databricks is designed for individuals with an intermediate-level proficiency in Python and a working understanding of machine learning and deep learning. It focuses on the practical application of LLMs through various frameworks.
Participants will learn to build LLM-focused applications using popular libraries like Hugging Face and LangChain. The curriculum covers key concepts, including the distinctions between pre-training, fine-tuning, and prompt engineering. Industry experts, including Matei Zaharia (CTO of Databricks and Associate Professor of Computer Science at Stanford) and Harrison Chase (Co-Founder and CEO of LangChain), provide insightful lectures.
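To give a flavour of the application building the course covers, here is a minimal text-generation sketch using the Hugging Face transformers library; the model name (gpt2) and the prompt are illustrative choices, not taken from the course materials.

# Minimal text-generation sketch with Hugging Face transformers.
# The model and prompt are illustrative; the course may use different
# models and additional frameworks such as LangChain.
from transformers import pipeline

generator = pipeline("text-generation", model="gpt2")

prompt = "Large language models are useful because"
outputs = generator(prompt, max_new_tokens=40, num_return_sequences=1)
print(outputs[0]["generated_text"])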
By the course’s conclusion, participants will have constructed an end-to-end LLM workflow ready for production deployment. The course can be audited for free, meaning the course materials are accessible without payment; a nominal fee applies if you want a managed compute environment for the course labs, graded exercises, and a certificate.
Introduction to AI with Python
The introductory free online course “CS50’s Introduction to AI with Python”, offered by the Harvard School of Engineering and Applied Sciences, covers fundamental concepts and algorithms in AI and ML using Python. It spans seven weeks, is self-paced, and has a flexible time commitment of 10-30 hours per week.
Participants will explore various topics, including graph search algorithms, reinforcement learning, machine learning, and principles of artificial intelligence. The instructors, professors David J. Malan and Brian Yu, both from Harvard University, guide students through hands-on projects to reinforce theoretical knowledge.
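To give a sense of the graph search material, a minimal breadth-first search in Python might look like the sketch below; the example graph is hypothetical and not taken from the course.

from collections import deque

# Breadth-first search over a small, hypothetical graph given as an
# adjacency list; returns a shortest path (by edge count) or None.
def bfs(graph, start, goal):
    frontier = deque([[start]])
    explored = {start}
    while frontier:
        path = frontier.popleft()
        node = path[-1]
        if node == goal:
            return path
        for neighbour in graph.get(node, []):
            if neighbour not in explored:
                explored.add(neighbour)
                frontier.append(path + [neighbour])
    return None

graph = {"A": ["B", "C"], "B": ["D"], "C": ["D"], "D": ["E"]}
print(bfs(graph, "A", "E"))  # ['A', 'B', 'D', 'E']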
ChatGPT Prompt Engineering for Developers
Also created by DeepLearning.AI, this free course is offered in partnership with OpenAI and taught by Isa Fulford (OpenAI) and Andrew Ng. It covers the essentials of prompt engineering for developers, from beginner to advanced levels. Participants will learn to use LLMs effectively, specifically through the OpenAI API, enabling them to swiftly develop innovative applications.
The course explains LLM functionality, offers prompt engineering best practices, and demonstrates the practical application of LLM APIs in various tasks such as summarising, inferring, transforming text, and expanding content.
It also imparts two key principles for crafting effective prompts, systematic approaches to prompt engineering, and the creation of custom chatbots. The content is beginner-friendly, requiring only basic Python knowledge, while also catering to advanced machine learning engineers seeking cutting-edge insights into prompt engineering and LLM usage.
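As an illustration of the API-based prompting the course teaches, the sketch below uses the OpenAI Python SDK to summarise a piece of text; the model name, prompt wording, and sample text are assumptions made for this example, not the course’s own code.

# Minimal summarisation sketch with the OpenAI Python SDK (v1.x).
# Assumes OPENAI_API_KEY is set in the environment; the model name and
# prompt are illustrative and not taken from the course materials.
from openai import OpenAI

client = OpenAI()

text = (
    "Large language models can summarise, classify, transform, and expand "
    "text when given clear, well-structured instructions."
)

prompt = (
    "Summarise the text delimited by triple backticks in one sentence.\n"
    f"```{text}```"
)

response = client.chat.completions.create(
    model="gpt-3.5-turbo",
    messages=[{"role": "user", "content": prompt}],
    temperature=0,
)

print(response.choices[0].message.content)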
Career Essentials in Generative AI
Microsoft and LinkedIn have launched a free course titled “Career Essentials in Generative AI”, which serves as an introductory resource on generative AI and its practical applications. The course aims to equip learners with the fundamental skills needed to succeed in the generative AI domain.
It covers key areas such as an overview of AI tools, how generative AI models work, the distinction between search engines and reasoning engines in the context of generative AI, using Microsoft Bing Chat for efficient work processes, a preview of Microsoft 365 Copilot, and the ethical considerations involved in creating and deploying generative AI.
The free course is structured to provide a comprehensive understanding of the subject matter, encompassing both technical aspects and ethical dimensions.
ChatGPT Prompt Book
The ChatGPT Prompt Book consists of over 300 unique writing prompts generated by the ChatGPT language model, designed for creative thinking and finding new ideas and perspectives. The prompts cover a diverse range of subjects and are adaptable to various writing styles and genres, making them useful for writers of all levels of experience, from beginners to professionals.
Generative AI Foundations on AWS
Generative AI Foundations on AWS is an eight-hour course on YouTube offering practical insights and hands-on guidance for pre-training, fine-tuning, and deploying foundation models on AWS. It emphasises breaking down theory, mathematics, and abstract concepts, providing hands-on exercises to build practical intuition. The course progressively explores complex generative AI techniques, enabling participants to understand, design, and apply their own models effectively.
Topics include a recap of foundation models, selecting the right model for specific use cases, pre-training new models, scaling laws relating model, dataset, and compute sizes, preparing training datasets at scale on AWS, fine-tuning models, and leveraging reinforcement learning with human feedback.
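As a rough pointer to what the scaling-laws topic involves, one widely cited formulation (the Chinchilla law from Hoffmann et al., 2022, cited here for illustration rather than drawn from the AWS course) models pre-training loss L as a function of parameter count N and training tokens D:

% Chinchilla-style scaling law (Hoffmann et al., 2022); E, A, B, \alpha, \beta
% are fitted constants.
L(N, D) = E + \frac{A}{N^{\alpha}} + \frac{B}{D^{\beta}}

Under a fixed compute budget, fitting such a curve indicates how to trade off model size against the number of training tokens, which is the kind of reasoning the course’s scaling-laws segment addresses.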