Company
Qodo / Stackblitz
Title
Scaling AI-Powered Code Generation in Browser and Enterprise Environments
Industry
Tech
Year
2024
Summary (short)
The case study examines two companies' approaches to deploying LLMs for code generation at scale: Stackblitz's Bolt.new achieving over $8M ARR within two months with their browser-based development environment, and Qodo's enterprise-focused solution handling complex deployment scenarios across 96 different configurations. The two companies demonstrate contrasting approaches to productionizing LLMs, with Bolt.new focusing on simplified web app development for non-developers and Qodo targeting enterprise testing and code review workflows.
This case study examines two different approaches to deploying LLMs for code generation at scale, highlighting the challenges and solutions in both consumer and enterprise environments.

Stackblitz's Bolt.new represents a breakthrough in browser-based AI code generation, achieving remarkable growth with over $8M ARR within two months of launch. The key to their success lies in their WebContainer technology, a custom-built operating system that runs entirely in the browser. Unlike traditional approaches that try to convert Docker containers to WebAssembly, Stackblitz wrote their own stripped-down OS specifically for browser environments, resulting in a dramatically smaller footprint (around 1MB compared to 60-100MB for Docker conversions).

The LLMOps architecture of Bolt.new is built around providing maximum context to the LLM while maintaining efficiency. Their approach differs from competitors by:

* Instrumenting their WebContainer OS at multiple levels (process, runtime, etc.) to capture detailed error information
* Using this rich context to help the LLM understand the full state of the application
* Implementing a sophisticated error handling system that can automatically detect and fix issues
* Breaking down complex tasks into smaller, more manageable pieces for the LLM to handle

Their pricing model evolved to reflect the computational demands of providing this rich context, with tiers ranging from $9 to $200 per month, plus usage-based billing for additional tokens. This demonstrates how LLMOps capabilities can directly translate to business value when properly productized.

Qodo takes a different approach, focusing on enterprise deployments with their code testing and review solution. Their LLMOps challenges center around handling extreme deployment flexibility (see the configuration sketch below), supporting:

* Multiple code management systems (GitHub, GitLab, Subversion) in both cloud and on-premises configurations
* Various model deployment options (AWS Bedrock, Azure OpenAI, on-premises GPUs)
* Different networking configurations and security requirements
* Integration with existing development workflows and tools

Qodo has developed multiple specialized models for different tasks:

* A code autocomplete model
* A chat model
* A dedicated code review model
* A code embedding model

Their approach to LLMOps includes sophisticated flow engineering techniques, breaking down complex problems into smaller tasks to improve model performance. This is exemplified by their AlphaCodium system, which achieved better performance than OpenAI's code models by implementing a structured problem-solving approach (a simplified sketch of this kind of flow appears below):

* Initial problem reasoning and breakdown
* Analysis of test cases
* Generation of multiple potential solutions
* Creation of diverse test cases
* Iterative improvement based on test results

Both companies demonstrate different aspects of successful LLMOps deployments. Stackblitz shows how focusing on a specific use case (browser-based development) and optimizing the entire stack can create a compelling product, while Qodo illustrates how to handle the complexity of enterprise deployments with multiple models and deployment scenarios.
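To give a sense of what supporting this kind of configuration matrix involves, the following sketch enumerates deployment combinations behind a small abstraction layer. The dimensions, class names, and validation rule are invented for illustration and are not taken from Qodo's internal design.

```python
"""Sketch of a deployment-configuration matrix (SCM x model backend x network
setup). Every name here is illustrative, not Qodo's actual implementation."""

from dataclasses import dataclass
from itertools import product

# Dimensions that multiply into the overall configuration matrix.
SCM_OPTIONS = ["github_cloud", "github_enterprise", "gitlab_cloud", "gitlab_self_hosted", "subversion"]
MODEL_BACKENDS = ["aws_bedrock", "azure_openai", "on_prem_gpu"]
NETWORK_MODES = ["public", "private_link", "air_gapped"]


@dataclass(frozen=True)
class DeploymentConfig:
    scm: str
    model_backend: str
    network: str

    def validate(self) -> bool:
        # Example constraint: an air-gapped network cannot call cloud model APIs.
        if self.network == "air_gapped" and self.model_backend != "on_prem_gpu":
            return False
        return True


# Enumerate and filter the matrix; each valid combination would map to concrete
# adapters (SCM client, model client, network settings) at startup.
valid_configs = [
    cfg
    for scm, backend, net in product(SCM_OPTIONS, MODEL_BACKENDS, NETWORK_MODES)
    if (cfg := DeploymentConfig(scm, backend, net)).validate()
]
print(f"{len(valid_configs)} valid deployment combinations")
```

The point of such an abstraction is that product logic only ever sees a validated `DeploymentConfig`, while the combinatorial explosion of environments is handled in one place.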
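To make the flow-engineering idea concrete, here is a minimal Python sketch of an AlphaCodium-style staged pipeline. It is not Qodo's actual implementation: `call_llm` and `run_tests` are hypothetical placeholders standing in for a real model API and a sandboxed test runner, and the prompts are illustrative only.

```python
"""Minimal sketch of a staged code-generation flow: rather than asking for a
final answer in one shot, the problem is decomposed into calls whose outputs
feed the next stage, ending with test-driven repair."""

from dataclasses import dataclass


def call_llm(prompt: str) -> str:
    """Placeholder for a real LLM call (wire up your model provider here)."""
    raise NotImplementedError


def run_tests(code: str, tests: list[str]) -> list[str]:
    """Placeholder: execute candidate code against tests, return failure messages."""
    raise NotImplementedError


@dataclass
class FlowResult:
    code: str
    failures: list[str]


def solve(problem: str, public_tests: list[str], max_iters: int = 3) -> FlowResult:
    # Stage 1: reason about the problem before writing any code.
    reasoning = call_llm(f"Restate and analyze this problem step by step:\n{problem}")

    # Stage 2: analyze the provided test cases to pin down edge cases.
    test_analysis = call_llm(
        f"Given this analysis:\n{reasoning}\nExplain what these tests imply:\n{public_tests}"
    )

    # Stage 3: propose several candidate solutions, then implement the best one.
    candidates = call_llm(
        f"Propose 2-3 distinct solution outlines.\nProblem: {problem}\nNotes: {test_analysis}"
    )
    code = call_llm(f"Implement the most promising outline as complete code:\n{candidates}")

    # Stage 4: generate additional, diverse tests beyond the public ones.
    extra_tests = call_llm(f"Write additional edge-case tests for:\n{problem}").splitlines()

    # Stage 5: iterate -- run tests, feed failures back, and repair the code.
    failures: list[str] = []
    for _ in range(max_iters):
        failures = run_tests(code, public_tests + extra_tests)
        if not failures:
            break
        code = call_llm(
            f"Fix this code so the failing tests pass.\nCode:\n{code}\nFailures:\n{failures}"
        )

    return FlowResult(code=code, failures=failures)
```

The key design choice is that each stage consumes structured output from the previous one, so the model never has to solve the whole problem in a single shot.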
The case study highlights several key LLMOps lessons:

* The importance of context management and error handling in production LLM systems (see the sketch below)
* How breaking down complex tasks can improve model performance
* The value of specialized models for different aspects of the development workflow
* The need for flexible deployment options in enterprise environments
* The importance of integration with existing development tools and workflows

The companies take different approaches to open source: Stackblitz open-sources core components of Bolt while keeping its WebContainer technology proprietary, whereas Qodo maintains both open source and commercial versions of its tools. This highlights the balance between community engagement and commercial success in LLMOps products.

Both companies emphasize the importance of real-world validation and testing: Stackblitz focuses on user feedback and iteration, while Qodo implements sophisticated testing frameworks and lets enterprise customers influence its indexing and best practices.

The case study demonstrates how LLMOps is maturing beyond simple model deployment to encompass entire development workflows, with successful companies building sophisticated infrastructure to handle everything from error management to enterprise deployment scenarios.
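As an illustration of the first lesson above, the sketch below shows one way instrumented runtime context (captured errors plus relevant files) might be folded into a repair prompt, in the spirit of Bolt.new's approach. The `RuntimeEvent` and `AppState` shapes, the relevance heuristic, and the prompt format are assumptions made for this example, not Stackblitz's actual design.

```python
"""Illustrative sketch: assembling instrumented runtime context into a single
repair prompt so the model sees the application's actual state."""

from dataclasses import dataclass, field


@dataclass
class RuntimeEvent:
    source: str   # e.g. "process", "runtime", "build"
    message: str  # captured error or log line


@dataclass
class AppState:
    files: dict[str, str]                      # path -> current file contents
    events: list[RuntimeEvent] = field(default_factory=list)


def build_repair_prompt(state: AppState, max_files: int = 5) -> str:
    """Fold captured errors plus the most relevant files into one prompt."""
    error_section = "\n".join(f"[{e.source}] {e.message}" for e in state.events)

    # Naive relevance heuristic for this sketch: prefer files mentioned in errors.
    mentioned = [p for p in state.files if any(p in e.message for e in state.events)]
    chosen = (mentioned or list(state.files))[:max_files]
    file_section = "\n\n".join(f"--- {p} ---\n{state.files[p]}" for p in chosen)

    return (
        "The running application produced the following errors:\n"
        f"{error_section}\n\n"
        "Relevant source files:\n"
        f"{file_section}\n\n"
        "Propose the smallest set of file edits that fixes these errors."
    )


# Example usage with toy data:
state = AppState(
    files={"src/App.tsx": "export default function App() { return <Hero/> }"},
    events=[RuntimeEvent("runtime", "ReferenceError: Hero is not defined in src/App.tsx")],
)
print(build_repair_prompt(state))
```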
