e2b-dev/fragments

Daily Info Digest · 2026-02-15

Category: Open-source project
Source: github_search
Score: 47
Published: 2026-02-15T17:46:17Z

AI Summary

E2B has released Fragments, an open-source project for building Claude Artifacts/v0-style AI code-generation apps that execute the generated code securely. It supports multiple tech stacks and model providers, and it matters because it lowers the barrier and the scaling cost of building runnable AI coding products.
#GitHub #repo #OpenSource #E2B #Fragments #Next.js #VercelAISDK

Content Excerpt

(Preview images: E2B Fragments, light and dark themes)
Fragments by E2B

This is an open-source version of apps like Anthropic's Claude Artifacts, Vercel v0, or GPT Engineer.

Powered by the E2B SDK.

→ Try on fragments.e2b.dev
Features
Based on Next.js 14 (App Router, Server Actions), shadcn/ui, TailwindCSS, Vercel AI SDK.
Uses the E2B SDK to securely execute code generated by AI.
Streaming in the UI.
Can install and use any package from npm or pip.
Supported stacks (add your own):
🔸 Python interpreter
🔸 Next.js
🔸 Vue.js
🔸 Streamlit
🔸 Gradio
Supported LLM Providers (add your own):
🔸 OpenAI
🔸 Anthropic
🔸 Google AI
🔸 Mistral
🔸 Groq
🔸 Fireworks
🔸 Together AI
🔸 Ollama
Integrates with the Morph Apply model for token-efficient, accurate, and faster code editing.

**Make sure to give us a star!**

<img width="165" alt="Screenshot 2024-04-20 at 22 13 32" src="https://github.com/mishushakov/llm-scraper/assets/10400064/11e2a79f-a835-48c4-9f85-5c104ca7bb49">
Get started
Prerequisites
git
A recent version of Node.js and the npm package manager
E2B API Key
LLM Provider API Key
Clone the repository

In your terminal:
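A typical clone command, assuming the canonical GitHub repository URL:

```shell
git clone https://github.com/e2b-dev/fragments.git
```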
Install the dependencies

Enter the repository:
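Assuming the repository was cloned into a folder named after the repo:

```shell
cd fragments
```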

Run the following to install the required dependencies:
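For a standard npm-based Next.js project, installation is:

```shell
npm install
```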
Set the environment variables

Create a .env.local file and set the following:
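A minimal sketch of .env.local. The variable names below are assumptions; check the repository's environment template for the exact names required:

```shell
# E2B API key for the code-execution sandbox (variable name assumed)
E2B_API_KEY=your-e2b-api-key

# At least one LLM provider key, e.g. OpenAI (variable name assumed)
OPENAI_API_KEY=your-openai-api-key
```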
Start the development server
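With a standard Next.js setup, the dev server is started with:

```shell
npm run dev
```

By default, the Next.js dev server listens on http://localhost:3000.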
Build the web app
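A production build for a Next.js app:

```shell
npm run build
```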
Customize
Adding custom templates
Make sure E2B CLI is installed and you're logged in.
Add a new folder under sandbox-templates/
Initialize a new template using E2B CLI:
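The E2B CLI's template subcommand can initialize a template; run it inside the new folder:

```shell
e2b template init
```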

 

 This will create a new file called e2b.Dockerfile.
Adjust the e2b.Dockerfile

 Here's an example Streamlit template:
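An illustrative sketch of such an e2b.Dockerfile (base image and package versions are assumptions, not the repo's exact template):

```dockerfile
# e2b.Dockerfile — minimal Streamlit sandbox sketch
FROM python:3.10-slim

# Install the framework the sandbox will run
RUN pip install streamlit

# Copy the app code into the sandbox user's home directory
WORKDIR /home/user
COPY . /home/user
```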
Specify a custom start command in e2b.toml:
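A sketch of the start command entry; the field name and paths here are assumptions — consult the E2B CLI docs for the exact e2b.toml schema:

```toml
# e2b.toml — run the Streamlit app when the sandbox starts (illustrative)
start_cmd = "streamlit run /home/user/app.py --server.port 8501"
```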
Deploy the template with the E2B CLI
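A typical build-and-deploy invocation; the template name is illustrative:

```shell
e2b template build --name my-streamlit-template
```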

 

 After the build has finished, you should get a confirmation message.
Open lib/templates.json in your code editor.

 Add your new template to the list. Here's an example for Streamlit:
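A sketch of a templates.json entry, using the fields described below (the exact key names are assumptions based on that description):

```json
{
  "streamlit-developer": {
    "name": "Streamlit developer",
    "lib": ["streamlit", "pandas"],
    "file": "app.py",
    "port": 8501,
    "instructions": "A Streamlit app that reloads automatically."
  }
}
```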

 

 Provide a template id (as key), name, list of dependencies, entrypoint and a port (optional). You can also add additional instructions that will be given to the LLM.
Optionally, add a new logo under public/thirdparty/templates
Adding custom LLM models
Open lib/models.json in your code editor.
Add a new entry to the models list:
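A sketch of a models.json entry using the fields described below (the model shown is illustrative):

```json
{
  "id": "mistral-large-latest",
  "name": "Mistral Large",
  "provider": "Mistral",
  "providerId": "mistral"
}
```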

 

 Where id is the model id, name is the model name (visible in the UI), provider is the provider name and providerId is the provider tag (see adding providers below).
Adding custom LLM providers
Open lib/models.ts in your code editor.
Add a new entry to the providerConfigs list:

 Example for fireworks:
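A sketch of what such an entry might look like, assuming Fireworks is wired up through the Vercel AI SDK's OpenAI-compatible client; the property names, function shape, and env-var name are assumptions, and the repo's actual code may differ:

```typescript
// lib/models.ts — illustrative providerConfigs entry (shape assumed)
import { createOpenAI } from '@ai-sdk/openai'

const providerConfigs = {
  // Fireworks exposes an OpenAI-compatible endpoint, so the
  // OpenAI client can be pointed at its base URL.
  fireworks: ({ modelNameString, apiKey }: { modelNameString: string; apiKey?: string }) =>
    createOpenAI({
      apiKey: apiKey || process.env.FIREWORKS_API_KEY,
      baseURL: 'https://api.fireworks.ai/inference/v1',
    })(modelNameString),
}
```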
Optionally, adjust the default structured output mode in the getDefaultMode function:
Optionally, add a new logo under public/thirdparty/logos
Contributing

As an open-source project, we welcome contributions from the community. If you are experiencing any bugs or want to add some improvements, please feel free to open an issue or pull request.