zaoyang
@zaoyang
Owner of \newline and previously co-creator of Farmville (200M users, $3B revenue) and Kaspa ($3B market cap). Self-taught in gaming, crypto, deep learning, and now generative AI. Newline is used by 250,000+ professionals from Salesforce, Adobe, Disney, Amazon, and more. Newline has built editorial tools using LLMs, article generation using reinforcement learning and LLMs, and instructor outreach tools. Newline is currently building generative AI products that will be announced soon.
Contributed IDE list
No IDE contributions yet
User's comments
0
courses
Power AI course
Advanced Context Engineering is the spine of this course: you are not just learning prompts; you are learning a full context stack that runs from tokens and embeddings through attention, synthetic data, evaluation, and RAG. We start at the numerical layer (vectors, tensors, neural networks, attention), then climb up through tokens, embeddings, multimodal representations, and transformer internals so you see exactly how context is encoded, routed, and used for prediction. On top of that foundation, you learn how to design prompts as controllable programs, generate and curate synthetic data, apply axial coding and LLM-as-judge evaluation to detect failure patterns, and build RAG systems that act as real search indices across vector databases, APIs, SQL, and the web. By the end, you have a rare, end-to-end context engineering skill set: you know how to shape model behavior with prompts, stress-test it with synthetic data, debug it with evaluations, and wrap it in advanced RAG, so you can plug everything into a full-stack AI system for real products. This depth and sequencing of content is usually scattered across research papers, internal company docs, and niche talks; here it is integrated into one coherent stack that is hard to find in a single course.
Nov 27th 2025
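To make the RAG layer of that stack concrete, here is a minimal retrieval sketch in plain Python/numpy. The embed() function, the documents list, and the retrieve() helper are hypothetical placeholders for illustration, not code from the course; a real pipeline would use an actual embedding model and a vector database.

```python
import numpy as np

def embed(text: str, dim: int = 64) -> np.ndarray:
    """Placeholder embedding: hash words into a fixed-size vector.
    A real system would call an embedding model instead."""
    vec = np.zeros(dim)
    for word in text.lower().split():
        vec[hash(word) % dim] += 1.0
    norm = np.linalg.norm(vec)
    return vec / norm if norm > 0 else vec

# A tiny in-memory "vector index": one embedding per document.
documents = [
    "Attention lets a transformer weigh which tokens matter for a prediction.",
    "A vector database stores embeddings and returns nearest neighbors.",
    "LLM-as-judge evaluation scores model outputs against a rubric.",
]
index = np.stack([embed(d) for d in documents])

def retrieve(query: str, k: int = 2) -> list[str]:
    """Rank documents by cosine similarity and return the top k as context."""
    scores = index @ embed(query)
    top = np.argsort(scores)[::-1][:k]
    return [documents[i] for i in top]

# Retrieved passages get packed into the prompt as context.
question = "how do embeddings power search?"
context = "\n".join(retrieve(question))
prompt = f"Answer using this context:\n{context}\n\nQuestion: {question}"
print(prompt)
```

The same pattern scales up: swap embed() for a real encoder and the in-memory matrix for a vector database, API, SQL source, or web search, and the retrieved passages become the model's context.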
AI Bootcamp
Everyone’s heard of ChatGPT, but what truly powers these modern large language models? It all starts with the transformer architecture. This bootcamp demystifies LLMs, taking you from concept to code and giving you a full, hands-on understanding of how transformers work. You’ll gain intuitive insights into the core components, including autoregressive decoding, multi-head attention, and more, while bridging theory, math, and code. By the end, you’ll be ready to understand, build, and optimize LLMs, with the skills to read research papers, evaluate models, and confidently tackle ML interviews.
Jul 11th 2025
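As a taste of what "multi-head attention" and "autoregressive decoding" refer to, here is a single-head, numpy-only sketch of scaled dot-product attention with a causal mask; the mask is what forces each position to attend only to earlier tokens, which is the core of autoregressive decoding. The function name and toy shapes are illustrative assumptions, not course code.

```python
import numpy as np

def causal_attention(Q: np.ndarray, K: np.ndarray, V: np.ndarray) -> np.ndarray:
    """Scaled dot-product attention with a causal mask.
    Q, K, V have shape (seq_len, d_k); output has shape (seq_len, d_k)."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)             # (seq_len, seq_len) similarity scores
    mask = np.triu(np.ones_like(scores), k=1)   # 1s above the diagonal mark future positions
    scores = np.where(mask == 1, -1e9, scores)  # block attention to future tokens
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights = weights / weights.sum(axis=-1, keepdims=True)  # softmax over each row
    return weights @ V                          # weighted sum of value vectors

# Toy example: 4 tokens, 8-dimensional vectors. In a real transformer,
# Q, K, V are learned projections of x; here we reuse x for simplicity.
rng = np.random.default_rng(0)
x = rng.normal(size=(4, 8))
out = causal_attention(x, x, x)
print(out.shape)  # (4, 8)
```

A multi-head version simply runs several of these in parallel on different learned projections of the same tokens and concatenates the results.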
AI bootcamp 2
This advanced AI Bootcamp teaches you to design, debug, and optimize full-stack AI systems that adapt over time. You will master byte-level models, advanced decoding, and RAG architectures that integrate text, images, tables, and structured data. You will learn multi-vector indexing, late interaction, and reinforcement learning techniques such as DPO, PPO, and verifier-guided feedback. Through 50+ hands-on labs using Hugging Face, DSPy, LangChain, and OpenPipe, you will graduate able to architect, deploy, and evolve enterprise-grade AI pipelines with precision and scalability.
Aug 12th 2025
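For readers unfamiliar with the term, "late interaction" means scoring a query against a document token by token instead of with a single pooled vector, in the style of ColBERT's MaxSim operator. The sketch below uses random toy embeddings and a hypothetical maxsim_score() helper purely for illustration; it is not taken from the course materials.

```python
import numpy as np

def maxsim_score(query_vecs: np.ndarray, doc_vecs: np.ndarray) -> float:
    """Late-interaction (MaxSim) relevance score.
    query_vecs: (num_query_tokens, dim); doc_vecs: (num_doc_tokens, dim).
    Each query token contributes the similarity of its best-matching document token."""
    sims = query_vecs @ doc_vecs.T           # (num_query_tokens, num_doc_tokens)
    return float(sims.max(axis=1).sum())     # best doc token per query token, then sum

# Toy example: a 3-token query ranked against two documents of different lengths.
rng = np.random.default_rng(1)
query = rng.normal(size=(3, 16))
doc_a = rng.normal(size=(10, 16))
doc_b = rng.normal(size=(25, 16))
ranked = sorted([("doc_a", doc_a), ("doc_b", doc_b)],
                key=lambda item: maxsim_score(query, item[1]),
                reverse=True)
print([name for name, _ in ranked])
```

Multi-vector indexing stores these per-token embeddings so the MaxSim comparison can be run at retrieval time rather than re-encoding every document per query.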
books
zaoyang hasn't published any books