Andrew Ng’s DeepLearning.AI, in partnership with Amazon Web Services (AWS), has announced an exciting new course on Coursera called “Generative AI with Large Language Models” to address the growing demand for expertise in this field.
By enrolling in this course, participants will gain a comprehensive understanding of the generative AI lifecycle based on LLMs and the underlying transformer architecture that powers them. They will learn how to effectively utilise LLMs for various tasks by selecting the most suitable model and implementing appropriate training techniques.
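The transformer architecture the course covers is built around scaled dot-product attention. As a rough illustration (not course material), the core computation — softmax(QKᵀ/√d)·V — can be sketched in plain Python with toy two-dimensional embeddings:

```python
import math

def softmax(xs):
    # Numerically stable softmax over a list of scores.
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def attention(queries, keys, values):
    """Scaled dot-product attention: softmax(Q K^T / sqrt(d)) V."""
    d = len(queries[0])
    out = []
    for q in queries:
        # Similarity of this query to every key, scaled by sqrt(d).
        scores = [sum(qi * ki for qi, ki in zip(q, k)) / math.sqrt(d)
                  for k in keys]
        weights = softmax(scores)
        # Weighted average of the value vectors.
        out.append([sum(w * v[j] for w, v in zip(weights, values))
                    for j in range(len(values[0]))])
    return out

# Two tokens with 2-dimensional embeddings (toy numbers for illustration).
Q = [[1.0, 0.0], [0.0, 1.0]]
K = [[1.0, 0.0], [0.0, 1.0]]
V = [[1.0, 2.0], [3.0, 4.0]]
print(attention(Q, K, V))
```

Real transformers apply this per attention head over learned projections of the token embeddings; the snippet only shows the attention step itself.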
Apart from Andrew Ng, the instructors include Antje Barth, principal developer advocate at AWS; Chris Fregly, principal solutions architect at AWS; Shelbee Eigenbrode, principal solutions architect at AWS; and Mike Chambers, developer advocate at AWS.
The course will also cover cutting-edge methods for training, fine-tuning, inference, and deployment of models, ensuring optimal performance in real-world scenarios. Additionally, learners will acquire essential skills to navigate the evolving landscape of generative AI and effectively integrate it into their organisations and products. The course includes:
- Data gathering: Collecting relevant data for training the generative AI model.
- Model selection: Choosing the appropriate model architecture for the task.
- Performance evaluation: Assessing the quality and effectiveness of the generated outputs.
- Deployment: Implementing the generative AI model in a real-world setting.
Upon completing the course, learners will be able to:
- Provide a detailed description of the transformer architecture that powers LLMs.
- Explain how LLMs are trained using the transformer architecture.
- Discuss how fine-tuning allows LLMs to be adapted to specific use cases.
- Utilise empirical scaling laws to optimise the model’s objective function across dataset size, compute budget, and inference requirements.
- Apply state-of-the-art training, tuning, inference, tools, and deployment methods.
- Use advanced techniques and tools to maximise the performance of generative AI models.
- Consider the specific constraints and requirements of the project.
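To make the scaling-laws objective above concrete, here is a minimal sketch of compute-optimal model sizing. It assumes the widely cited Chinchilla heuristics (training FLOPs C ≈ 6·N·D and roughly 20 training tokens per parameter); both constants are illustrative assumptions, not figures from the course:

```python
# Rough compute-optimal sizing sketch, assuming the Chinchilla heuristics:
# training FLOPs C ~ 6 * N * D, and compute-optimal D ~ 20 * N,
# where N = parameters and D = training tokens. Illustrative only.

def compute_optimal(flops_budget, tokens_per_param=20.0):
    """Return (params, tokens) that roughly exhaust a FLOPs budget."""
    # C = 6 * N * (r * N)  =>  N = sqrt(C / (6 * r))
    n_params = (flops_budget / (6.0 * tokens_per_param)) ** 0.5
    n_tokens = tokens_per_param * n_params
    return n_params, n_tokens

# Example: sizing a model for a 1e23 FLOPs training budget.
params, tokens = compute_optimal(1e23)
print(f"~{params / 1e9:.1f}B parameters trained on ~{tokens / 1e9:.0f}B tokens")
```

Under these assumptions, a 1e23 FLOPs budget points to a model in the tens of billions of parameters trained on several hundred billion tokens; in practice, inference cost and data availability shift the trade-off, which is exactly the kind of constraint the course asks learners to weigh.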
As businesses adapt to leverage the power of generative AI, the associated complexities and uncertainties surrounding this technology have become inevitable. Andrew Ng said that this course aims to demystify the subject and equip learners with the knowledge and skills required to confidently harness the potential of LLMs in their endeavours.
Andrew Ng has been a vocal advocate of encouraging people to learn and adapt to generative AI. Earlier this month, he also released three new generative AI courses in partnership with LangChain, OpenAI, and Lamini, following a prompt engineering course released in April in partnership with OpenAI.
In December 2022, DeepLearning.AI also introduced the Mathematics for Machine Learning and Data Science Specialization, a beginner-level mathematics course for AI.