{ "cells": [ { "cell_type": "markdown", "metadata": { "id": "-PxsodrwP58p" }, "source": [ "To run this, press \"*Runtime*\" and press \"*Run all*\" on a **free** Tesla T4 Google Colab instance!\n", "
" ] }, { "cell_type": "markdown", "metadata": { "id": "a25cKF9SR_4c" }, "source": [ "For more detailed usage information, please refer to our [cookbook](https://colab.research.google.com/drive/1fdBns2QA1XNwF_tsvG3Hc27QGdViHH3b?usp=sharing)" ] }, { "cell_type": "markdown", "metadata": { "id": "tQvt0YVXPaHO" }, "source": [ "### Agenetic SFT Data generation with CAMEL and finetuning Meta models with Unsloth\n", "\n", "CAMEL and Unsloth make an excellent pair. In this notebook we will combine the two to train a model to be proficient at content on a page\n", "\n", "You will learn how to do data generation with CAMEL, how to train, and how to run the model." ] }, { "cell_type": "markdown", "metadata": { "id": "Hf-ePAgCPdI6" }, "source": [ "" ] }, { "cell_type": "code", "execution_count": null, "metadata": { "id": "2eSvM9zX_2d3" }, "outputs": [], "source": [ "%%capture\n", "!pip install unsloth\n", "# Install CAMEL-AI with no optional dependencies\n", "!pip install camel-ai==0.2.14\n", "# Get Unsloth\n", "!pip install --upgrade --no-deps \"unsloth[colab-new] @ git+https://github.com/unslothai/unsloth.git@0de54572525788d09a6a9ef1efc7611e65dd7547\"\n", "!pip install firecrawl" ] }, { "cell_type": "markdown", "metadata": { "id": "r2v_X2fA0Df5" }, "source": [ "First we will set the OPENAI_API_KEY that will be used to generate the data.\n", "\n", "CAMEL supports many other models. See [here](https://docs.camel-ai.org/key_modules/models.html) for a list." ] }, { "cell_type": "code", "execution_count": null, "metadata": { "colab": { "base_uri": "https://localhost:8080/" }, "id": "gq4Q0u0ZTvGs", "outputId": "f80f1073-4c1d-4ec4-c154-665e841b95d8" }, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "Enter your OpenAI API key: ··········\n", "Enter your Firecrawl API key: ··········\n" ] } ], "source": [ "from getpass import getpass\n", "import os\n", "\n", "openai_api_key = getpass('Enter your OpenAI API key: ')\n", "os.environ[\"OPENAI_API_KEY\"] = openai_api_key\n", "\n", "# Generate an API key at https://www.firecrawl.dev/app/api-keys\n", "firecrawl_api_key = getpass('Enter your Firecrawl API key: ')\n", "os.environ[\"FIRECRAWL_API_KEY\"] = firecrawl_api_key" ] }, { "cell_type": "markdown", "metadata": { "id": "iP5-hPz-0T6x" }, "source": [ "Next we will setup our model for training using Unsloth." 
] }, { "cell_type": "code", "execution_count": null, "metadata": { "colab": { "base_uri": "https://localhost:8080/", "height": 441, "referenced_widgets": [ "09744e002e3e47448b5c807424bf6956", "92408a369da54dd7bcddf6f6e50e9647", "eecd38a827cd4e7e94759e2497ed471e", "a9cf69cbe2314ec2a2e928e30db6dbfa", "eba29435f8f2428d87cf757c343bf6e7", "db51939f8da94b58a21cd945efc1bd63", "8058a98caaa5479c827c762dcaecdc7a", "ae99567a191740bcb1676c4bc71b8d57", "07f5cb55a7914ba994acda44294db8ac", "17d37e474ba34bedaa89830f543e5809", "57a028baaf034aa7b5ec44e4566f1802", "0d53408d8ede4b699ad2cf635938451b", "721b011e999e4f118facc65914af1c87", "09717565e7a94331b8e3e9618f5eb3af", "bd8a573b5b7a46a2b9e3f5fad61730e2", "04e99d4f4d784289bb5e442fac2f447c", "a92a81934edb4fc19c4603cdf9489559", "39d3a5a6441447fb88c2713a6baa918d", "3f60fe4920424667acf9a0bd9ec403de", "a978e1e9f9344a17899a8839372e0104", "6e9e0866439c444eb0830ddb2c0a8f0b", "95ed6c2c3131498e93f341159647cd75", "12c81caf98ed4c4cac311a021162a564", "3867254073fb4e52b1a56fb07752aa1c", "17bf788adc944f4cab1c8bdb0bb64b8d", "6026736dd09744f3b52ae6390cd89194", "c4252aeb37914583a57284bf0c6af3fe", "44768e2247cc472f85fb4191c7bebb4a", "63e04fcb4c784acc9a8e906840ce5777", "6e57e5cdca8643ee9e23a46b9ccdffb0", "8a695679ac3446518d3e61b30bc532f4", "eedbe8cf22f343289451142ddfd25b2e", "9e3b451a55984289a9ba58106387e823", "92cb2cbe77294e45b27041112bc1d7c6", "79f048b5c736497caae92fdc894a9224", "ed2e51a68ba346b38f59e95a264b72ac", "381c46960ece4da7aca34180ed6593ff", "9afdd6b7bc354237843b524026223185", "8474b7c07b744960928212e6ec6605bc", "5c24251b8c85417eb738e8cc73ec4cec", "3b1c9d5fa7e84af28dc1603a9d62d794", "12a89422ee874ad39ca8144c310b4014", "bbd573562ae14387818f82ddb2ff3c40", "b71a8f4df6824510b70262b9d0538f89", "a4b5c9dc622f4c4b95418ab1983fb63d", "97744bb661a8491d881b66201717650d", "a7dd14f4c0874c27afda0bf9af750d2e", "30d16cafe76f4c74913c81f826d40208", "00307ddeb366496abcb81dada5ec01e6", "1dd1ed4aaf964078b26b597c5318dc8e", "41da75ce3a1042469aebb4615aaad93b", "9a80c47aadd140c3ba4c56eeb533ac77", "85803d76d70c4cabaa1f1ddc88c36ef3", "bc3b2868f4994714a887a889e3a2a18e", "55e8c726ecf548b9b4fe6d5248bfd149", "195b7fa1e3c546f5a93c10ec03b728a9", "1591d9b09c8443579e37db07dbad1224", "c238699464f84887a715fadf4ce1f220", "b0f411fbcc474290b0f087b5ea9e8f08", "95b6473d466944ebbcf5532c2ddb7558", "c08e4858f6eb4c0ebd5c5af58e0b7328", "c85d4cd76dbe42a4bd0613fd01ebf787", "71bd3e831d2f446cac805d6253136f4d", "7dd901a8afe34e5c82523b3079deedbb", "6c7c652bb3204ee5bbc43662668b41f1", "50e2bab66cf844689c5fddbe818fec69" ] }, "id": "QmUBVEnvCDJv", "outputId": "e741a69a-0c66-48a5-a27c-d1f994c0031a" }, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "🦥 Unsloth: Will patch your computer to enable 2x faster free finetuning.\n", "🦥 Unsloth Zoo will now patch everything to make training faster!\n", "==((====))== Unsloth 2024.11.6: Fast Llama patching. Transformers = 4.47.0.\n", " \\\\ /| GPU: Tesla T4. Max memory: 14.748 GB. Platform = Linux.\n", "O^O/ \\_/ \\ Pytorch: 2.5.1+cu121. CUDA = 7.5. CUDA Toolkit = 12.1.\n", "\\ / Bfloat16 = FALSE. FA [Xformers = 0.0.28.post3. 
FA2 = False]\n", " \"-____-\" Free Apache license: http://github.com/unslothai/unsloth\n", "Unsloth: Fast downloading is enabled - ignore downloading bars which are red colored!\n" ] }, { "name": "stderr", "output_type": "stream", "text": [ "Unsloth: unsloth/tinyllama-bnb-4bit can only handle sequence lengths of at most 2048.\n", "But with kaiokendev's RoPE scaling of 2.0, it can be magically be extended to 4096!\n" ] }, { "data": { "application/vnd.jupyter.widget-view+json": { "model_id": "09744e002e3e47448b5c807424bf6956", "version_major": 2, "version_minor": 0 }, "text/plain": [ "model.safetensors: 0%| | 0.00/762M [00:00, ?B/s]" ] }, "metadata": {}, "output_type": "display_data" }, { "data": { "application/vnd.jupyter.widget-view+json": { "model_id": "0d53408d8ede4b699ad2cf635938451b", "version_major": 2, "version_minor": 0 }, "text/plain": [ "generation_config.json: 0%| | 0.00/124 [00:00, ?B/s]" ] }, "metadata": {}, "output_type": "display_data" }, { "data": { "application/vnd.jupyter.widget-view+json": { "model_id": "12c81caf98ed4c4cac311a021162a564", "version_major": 2, "version_minor": 0 }, "text/plain": [ "tokenizer_config.json: 0%| | 0.00/948 [00:00, ?B/s]" ] }, "metadata": {}, "output_type": "display_data" }, { "data": { "application/vnd.jupyter.widget-view+json": { "model_id": "92cb2cbe77294e45b27041112bc1d7c6", "version_major": 2, "version_minor": 0 }, "text/plain": [ "tokenizer.model: 0%| | 0.00/500k [00:00, ?B/s]" ] }, "metadata": {}, "output_type": "display_data" }, { "data": { "application/vnd.jupyter.widget-view+json": { "model_id": "a4b5c9dc622f4c4b95418ab1983fb63d", "version_major": 2, "version_minor": 0 }, "text/plain": [ "special_tokens_map.json: 0%| | 0.00/438 [00:00, ?B/s]" ] }, "metadata": {}, "output_type": "display_data" }, { "data": { "application/vnd.jupyter.widget-view+json": { "model_id": "195b7fa1e3c546f5a93c10ec03b728a9", "version_major": 2, "version_minor": 0 }, "text/plain": [ "tokenizer.json: 0%| | 0.00/1.84M [00:00, ?B/s]" ] }, "metadata": {}, "output_type": "display_data" }, { "name": "stderr", "output_type": "stream", "text": [ "Unsloth 2024.11.6 patched 22 layers with 22 QKV layers, 22 O layers and 22 MLP layers.\n" ] }, { "name": "stdout", "output_type": "stream", "text": [ "Unsloth: Training embed_tokens in mixed precision to save VRAM\n", "Unsloth: Training lm_head in mixed precision to save VRAM\n" ] } ], "source": [ "from unsloth import FastLanguageModel\n", "import torch\n", "max_seq_length = 4096\n", "dtype = None\n", "load_in_4bit = True\n", "\n", "model, tokenizer = FastLanguageModel.from_pretrained(\n", " model_name = \"unsloth/tinyllama-bnb-4bit\", # \"unsloth/tinyllama\" for 16bit loading\n", " max_seq_length = max_seq_length,\n", " dtype = dtype,\n", " load_in_4bit = load_in_4bit,\n", " # token = \"hf_...\", # use one if using gated models like meta-llama/Llama-2-7b-hf\n", ")\n", "model = FastLanguageModel.get_peft_model(\n", " model,\n", " r = 32,\n", " target_modules = [\"q_proj\", \"k_proj\", \"v_proj\", \"o_proj\",\n", " \"gate_proj\", \"up_proj\", \"down_proj\",\n", " \"embed_tokens\", \"lm_head\"],\n", " lora_alpha = 32,\n", " use_gradient_checkpointing = False, # @@@ IF YOU GET OUT OF MEMORY - set to True @@@\n", " random_state = 3407,\n", " use_rslora = False, # We support rank stabilized LoRA\n", " loftq_config = None, # And LoftQ\n", ")" ] }, { "cell_type": "markdown", "metadata": { "id": "RCb9quu7bcGE" }, "source": [ "Now as a control, lets see how this model does with our CAMEL-specific question" ] }, { "cell_type": "code", 
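{ "cell_type": "markdown", "metadata": {}, "source": [ "This excerpt omits the data-generation cells, so here is a minimal sketch of what they do: scrape a page with Firecrawl, then have a CAMEL ChatAgent write Alpaca-style instruction/response pairs grounded in that content. The URL, the prompt wording, and the single hard-coded pair below are illustrative assumptions - see the full cookbook for the exact agent setup and structured-output parsing." ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "from camel.agents import ChatAgent\n", "from camel.loaders import Firecrawl\n", "from camel.messages.conversion import AlpacaItem\n", "\n", "# Scrape the page the model should learn (illustrative URL).\n", "firecrawl = Firecrawl() # reads FIRECRAWL_API_KEY from the environment\n", "page = firecrawl.scrape(\"https://docs.camel-ai.org\") # assumed to return a dict with a \"markdown\" field\n", "content = page[\"markdown\"]\n", "\n", "# Ask a CAMEL agent for instruction/response pairs grounded in the page.\n", "agent = ChatAgent(\"You write concise instruction/response pairs grounded only in the provided content.\")\n", "response = agent.step(f\"Content:\\n{content}\\n\\nWrite one question a user might ask about this page, then answer it.\")\n", "print(response.msgs[0].content)\n", "\n", "# The full cookbook parses the agent's structured output into AlpacaItem\n", "# objects; one generated pair looks like this:\n", "alpaca_entries = [\n", "    AlpacaItem(\n", "        instruction=\"Explain how can I stay up to date with the CAMEL community.\",\n", "        input=\"\",\n", "        output=\"To keep up to date with the CAMEL community, engage in discussions, contribute to the Discord, and provide support to new contributors.\",\n", "    ),\n", "] # ...plus the rest of the generated pairs" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "The trainer-construction cell is also omitted; below is a minimal sketch of the standard Unsloth + TRL SFTTrainer setup, training directly on the Alpaca-formatted strings. The batch size, epoch count, and learning rate are illustrative assumptions, not the cookbook's exact values." ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "from datasets import Dataset\n", "from transformers import TrainingArguments\n", "from trl import SFTTrainer\n", "\n", "# Train on the Alpaca-formatted strings produced above.\n", "dataset = Dataset.from_dict({\"text\": [item.to_string() for item in alpaca_entries]})\n", "\n", "trainer = SFTTrainer(\n", "    model = model,\n", "    tokenizer = tokenizer,\n", "    train_dataset = dataset,\n", "    dataset_text_field = \"text\",\n", "    max_seq_length = max_seq_length,\n", "    packing = False,\n", "    args = TrainingArguments(\n", "        per_device_train_batch_size = 2, # illustrative\n", "        gradient_accumulation_steps = 4, # illustrative\n", "        num_train_epochs = 10, # illustrative\n", "        learning_rate = 2e-4,\n", "        fp16 = True, # the T4 has no bfloat16 support\n", "        logging_steps = 1,\n", "        optim = \"adamw_8bit\",\n", "        seed = 3407,\n", "        output_dir = \"outputs\",\n", "    ),\n", ")" ] }, { "cell_type": "markdown", "metadata": { "id": "RCb9quu7bcGE" }, "source": [ "Now let's train the model on the generated data. (As a control, you can first run the inference cell near the end of the notebook to see how the base model handles our CAMEL-specific question.)" ] }, { "cell_type": "code",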
"execution_count": null, "metadata": { "colab": { "base_uri": "https://localhost:8080/" }, "id": "YqQz8vdrbbrF", "outputId": "76f54930-21ec-4935-ea7a-4bfe6dd44d01" }, "outputs": [ { "data": { "text/plain": [ "['Step | \n", "Training Loss | \n", "
---|---|
1 | \n", "15.460000 | \n", "
2 | \n", "15.468400 | \n", "
3 | \n", "15.018300 | \n", "
4 | \n", "14.925600 | \n", "
5 | \n", "15.390800 | \n", "
6 | \n", "13.078800 | \n", "
7 | \n", "10.951600 | \n", "
8 | \n", "8.418600 | \n", "
9 | \n", "6.773500 | \n", "
10 | \n", "5.550200 | \n", "
11 | \n", "4.810200 | \n", "
12 | \n", "4.147800 | \n", "
13 | \n", "4.007100 | \n", "
14 | \n", "3.928000 | \n", "
15 | \n", "2.667200 | \n", "
16 | \n", "2.650600 | \n", "
17 | \n", "2.499000 | \n", "
18 | \n", "2.663400 | \n", "
19 | \n", "2.547900 | \n", "
20 | \n", "2.235200 | \n", "
21 | \n", "2.532900 | \n", "
22 | \n", "1.376300 | \n", "
23 | \n", "1.290800 | \n", "
24 | \n", "1.362200 | \n", "
25 | \n", "1.455200 | \n", "
26 | \n", "1.527000 | \n", "
27 | \n", "1.653200 | \n", "
28 | \n", "1.605200 | \n", "
29 | \n", "0.948600 | \n", "
30 | \n", "1.200500 | \n", "
31 | \n", "1.140400 | \n", "
32 | \n", "1.221700 | \n", "
33 | \n", "1.306400 | \n", "
34 | \n", "1.352100 | \n", "
35 | \n", "1.342400 | \n", "
36 | \n", "0.980700 | \n", "
37 | \n", "1.019300 | \n", "
38 | \n", "0.905500 | \n", "
39 | \n", "1.008400 | \n", "
40 | \n", "1.245300 | \n", "
41 | \n", "1.109700 | \n", "
42 | \n", "1.453200 | \n", "
43 | \n", "0.797300 | \n", "
44 | \n", "1.013500 | \n", "
45 | \n", "0.786900 | \n", "
46 | \n", "0.894800 | \n", "
47 | \n", "1.177700 | \n", "
48 | \n", "1.086000 | \n", "
49 | \n", "1.651600 | \n", "
50 | \n", "0.895400 | \n", "
51 | \n", "1.011800 | \n", "
52 | \n", "1.160500 | \n", "
53 | \n", "1.573800 | \n", "
54 | \n", "1.163000 | \n", "
55 | \n", "1.072900 | \n", "
56 | \n", "1.035000 | \n", "
57 | \n", "0.611800 | \n", "
58 | \n", "0.743100 | \n", "
59 | \n", "1.096500 | \n", "
60 | \n", "1.139200 | \n", "
61 | \n", "5.855300 | \n", "
62 | \n", "1.619300 | \n", "
63 | \n", "2.743400 | \n", "
64 | \n", "6.773300 | \n", "
65 | \n", "1.551500 | \n", "
66 | \n", "1.018600 | \n", "
67 | \n", "1.284600 | \n", "
68 | \n", "1.117300 | \n", "
69 | \n", "1.509800 | \n", "
70 | \n", "1.074700 | \n", "
71 | \n", "0.755100 | \n", "
72 | \n", "0.497300 | \n", "
73 | \n", "0.875900 | \n", "
74 | \n", "0.597500 | \n", "
75 | \n", "0.563600 | \n", "
76 | \n", "5.032300 | \n", "
77 | \n", "0.779900 | \n", "
78 | \n", "0.679300 | \n", "
79 | \n", "0.577900 | \n", "
80 | \n", "0.581000 | \n", "
81 | \n", "0.407500 | \n", "
82 | \n", "0.902900 | \n", "
83 | \n", "0.644100 | \n", "
84 | \n", "1.133200 | \n", "
85 | \n", "1.693800 | \n", "
86 | \n", "0.898600 | \n", "
87 | \n", "0.657700 | \n", "
88 | \n", "1.592000 | \n", "
89 | \n", "0.733100 | \n", "
90 | \n", "4.917600 | \n", "
91 | \n", "10.263400 | \n", "
92 | \n", "15.912000 | \n", "
93 | \n", "1.027500 | \n", "
94 | \n", "0.480500 | \n", "
95 | \n", "0.616700 | \n", "
96 | \n", "0.356600 | \n", "
97 | \n", "0.444900 | \n", "
98 | \n", "0.373300 | \n", "
99 | \n", "0.169900 | \n", "
100 | \n", "0.421400 | \n", "
101 | \n", "0.390900 | \n", "
102 | \n", "0.608500 | \n", "
103 | \n", "0.467300 | \n", "
104 | \n", "0.441600 | \n", "
105 | \n", "0.422500 | \n", "
106 | \n", "0.222500 | \n", "
107 | \n", "5.019700 | \n", "
108 | \n", "0.546100 | \n", "
109 | \n", "0.521400 | \n", "
110 | \n", "0.753000 | \n", "
111 | \n", "0.584300 | \n", "
112 | \n", "0.837300 | \n", "
113 | \n", "0.472400 | \n", "
114 | \n", "0.333900 | \n", "
115 | \n", "0.428700 | \n", "
116 | \n", "0.467700 | \n", "
117 | \n", "0.369400 | \n", "
118 | \n", "0.344600 | \n", "
119 | \n", "0.395600 | \n", "
120 | \n", "0.357300 | \n", "
121 | \n", "0.300700 | \n", "
122 | \n", "0.325000 | \n", "
123 | \n", "1.147700 | \n", "
124 | \n", "0.571100 | \n", "
125 | \n", "0.716600 | \n", "
126 | \n", "0.876000 | \n", "
127 | \n", "0.438000 | \n", "
128 | \n", "0.252000 | \n", "
129 | \n", "0.387100 | \n", "
130 | \n", "0.275600 | \n", "
131 | \n", "0.732000 | \n", "
132 | \n", "0.538800 | \n", "
133 | \n", "0.491600 | \n", "
134 | \n", "0.258900 | \n", "
135 | \n", "0.256400 | \n", "
136 | \n", "0.289900 | \n", "
137 | \n", "0.243500 | \n", "
138 | \n", "0.282100 | \n", "
139 | \n", "0.436600 | \n", "
140 | \n", "0.329900 | \n", "
141 | \n", "0.192600 | \n", "
142 | \n", "0.283100 | \n", "
143 | \n", "0.248700 | \n", "
144 | \n", "0.236600 | \n", "
145 | \n", "0.392700 | \n", "
146 | \n", "0.541900 | \n", "
147 | \n", "0.308900 | \n", "
148 | \n", "0.215700 | \n", "
149 | \n", "0.292200 | \n", "
150 | \n", "0.280200 | \n", "
151 | \n", "0.228000 | \n", "
152 | \n", "0.215800 | \n", "
153 | \n", "0.183700 | \n", "
154 | \n", "0.360800 | \n", "
155 | \n", "0.106800 | \n", "
156 | \n", "0.110200 | \n", "
157 | \n", "0.172400 | \n", "
158 | \n", "0.151800 | \n", "
159 | \n", "0.337000 | \n", "
160 | \n", "0.206900 | \n", "
161 | \n", "0.350600 | \n", "
162 | \n", "0.095200 | \n", "
163 | \n", "0.127200 | \n", "
164 | \n", "0.146800 | \n", "
165 | \n", "0.191800 | \n", "
166 | \n", "0.278800 | \n", "
167 | \n", "0.203200 | \n", "
168 | \n", "0.193900 | \n", "
169 | \n", "0.062100 | \n", "
170 | \n", "0.158200 | \n", "
171 | \n", "0.173700 | \n", "
172 | \n", "0.276500 | \n", "
173 | \n", "0.247900 | \n", "
174 | \n", "0.246000 | \n", "
175 | \n", "0.177600 | \n", "
176 | \n", "0.166400 | \n", "
177 | \n", "0.153300 | \n", "
178 | \n", "0.104300 | \n", "
179 | \n", "0.082900 | \n", "
180 | \n", "0.120300 | \n", "
181 | \n", "0.193300 | \n", "
182 | \n", "0.158800 | \n", "
183 | \n", "0.101400 | \n", "
184 | \n", "0.051500 | \n", "
185 | \n", "0.065700 | \n", "
186 | \n", "0.492000 | \n", "
187 | \n", "0.102400 | \n", "
188 | \n", "0.196900 | \n", "
189 | \n", "0.202600 | \n", "
190 | \n", "0.115700 | \n", "
191 | \n", "0.079100 | \n", "
192 | \n", "0.115200 | \n", "
193 | \n", "0.182000 | \n", "
194 | \n", "0.114600 | \n", "
195 | \n", "0.132200 | \n", "
196 | \n", "0.135900 | \n", "
197 | \n", "0.063800 | \n", "
198 | \n", "0.049400 | \n", "
199 | \n", "0.056400 | \n", "
200 | \n", "0.060900 | \n", "
201 | \n", "0.111500 | \n", "
202 | \n", "0.079900 | \n", "
203 | \n", "0.120600 | \n", "
204 | \n", "0.034400 | \n", "
205 | \n", "0.044000 | \n", "
206 | \n", "0.088100 | \n", "
207 | \n", "0.041800 | \n", "
208 | \n", "0.055100 | \n", "
209 | \n", "0.067700 | \n", "
210 | \n", "0.084300 | \n", "
211 | \n", "0.018500 | \n", "
212 | \n", "0.031700 | \n", "
213 | \n", "0.023000 | \n", "
214 | \n", "0.049500 | \n", "
215 | \n", "0.050600 | \n", "
216 | \n", "0.071000 | \n", "
217 | \n", "0.064800 | \n", "
218 | \n", "0.034000 | \n", "
219 | \n", "0.025900 | \n", "
220 | \n", "0.028800 | \n", "
221 | \n", "0.017200 | \n", "
222 | \n", "0.040000 | \n", "
223 | \n", "0.047600 | \n", "
224 | \n", "0.057600 | \n", "
225 | \n", "0.013700 | \n", "
226 | \n", "0.008300 | \n", "
227 | \n", "0.027100 | \n", "
228 | \n", "0.016100 | \n", "
229 | \n", "0.017700 | \n", "
230 | \n", "0.068300 | \n", "
231 | \n", "0.017000 | \n", "
232 | \n", "0.010700 | \n", "
233 | \n", "0.010600 | \n", "
234 | \n", "0.009200 | \n", "
235 | \n", "0.011800 | \n", "
236 | \n", "0.005800 | \n", "
237 | \n", "0.022100 | \n", "
238 | \n", "0.025600 | \n", "
239 | \n", "0.018900 | \n", "
240 | \n", "0.010100 | \n", "
241 | \n", "0.003800 | \n", "
242 | \n", "0.005600 | \n", "
243 | \n", "0.007100 | \n", "
244 | \n", "0.012300 | \n", "
245 | \n", "0.008400 | \n", "
246 | \n", "0.005600 | \n", "
247 | \n", "0.006700 | \n", "
248 | \n", "0.007800 | \n", "
"
],
"text/plain": [
" "
]
},
"metadata": {}
}
],
"source": [
"dtrainer_stats = trainer.train()"
]
},
{
"cell_type": "markdown",
"metadata": {
"id": "ekOmTR1hSNcr"
},
"source": [
"\n",
"### Inference\n",
"Let's run the model! You can change the instruction and input - leave the output blank!"
]
},
{
"cell_type": "code",
"execution_count": 11,
"metadata": {
"id": "kR3gIAX-SM2q",
"colab": {
"base_uri": "https://localhost:8080/"
},
"outputId": "e2186447-a801-4f5f-cf41-a7b91a8d9779"
},
"outputs": [
{
"output_type": "execute_result",
"data": {
"text/plain": [
"['\n",
" \n",
"
\n",
" \n",
" \n",
" \n",
" Step \n",
" Training Loss \n",
" \n",
" \n",
" 1 \n",
" 15.460000 \n",
" \n",
" \n",
" 2 \n",
" 15.468400 \n",
" \n",
" \n",
" 3 \n",
" 15.018300 \n",
" \n",
" \n",
" 4 \n",
" 14.925600 \n",
" \n",
" \n",
" 5 \n",
" 15.390800 \n",
" \n",
" \n",
" 6 \n",
" 13.078800 \n",
" \n",
" \n",
" 7 \n",
" 10.951600 \n",
" \n",
" \n",
" 8 \n",
" 8.418600 \n",
" \n",
" \n",
" 9 \n",
" 6.773500 \n",
" \n",
" \n",
" 10 \n",
" 5.550200 \n",
" \n",
" \n",
" 11 \n",
" 4.810200 \n",
" \n",
" \n",
" 12 \n",
" 4.147800 \n",
" \n",
" \n",
" 13 \n",
" 4.007100 \n",
" \n",
" \n",
" 14 \n",
" 3.928000 \n",
" \n",
" \n",
" 15 \n",
" 2.667200 \n",
" \n",
" \n",
" 16 \n",
" 2.650600 \n",
" \n",
" \n",
" 17 \n",
" 2.499000 \n",
" \n",
" \n",
" 18 \n",
" 2.663400 \n",
" \n",
" \n",
" 19 \n",
" 2.547900 \n",
" \n",
" \n",
" 20 \n",
" 2.235200 \n",
" \n",
" \n",
" 21 \n",
" 2.532900 \n",
" \n",
" \n",
" 22 \n",
" 1.376300 \n",
" \n",
" \n",
" 23 \n",
" 1.290800 \n",
" \n",
" \n",
" 24 \n",
" 1.362200 \n",
" \n",
" \n",
" 25 \n",
" 1.455200 \n",
" \n",
" \n",
" 26 \n",
" 1.527000 \n",
" \n",
" \n",
" 27 \n",
" 1.653200 \n",
" \n",
" \n",
" 28 \n",
" 1.605200 \n",
" \n",
" \n",
" 29 \n",
" 0.948600 \n",
" \n",
" \n",
" 30 \n",
" 1.200500 \n",
" \n",
" \n",
" 31 \n",
" 1.140400 \n",
" \n",
" \n",
" 32 \n",
" 1.221700 \n",
" \n",
" \n",
" 33 \n",
" 1.306400 \n",
" \n",
" \n",
" 34 \n",
" 1.352100 \n",
" \n",
" \n",
" 35 \n",
" 1.342400 \n",
" \n",
" \n",
" 36 \n",
" 0.980700 \n",
" \n",
" \n",
" 37 \n",
" 1.019300 \n",
" \n",
" \n",
" 38 \n",
" 0.905500 \n",
" \n",
" \n",
" 39 \n",
" 1.008400 \n",
" \n",
" \n",
" 40 \n",
" 1.245300 \n",
" \n",
" \n",
" 41 \n",
" 1.109700 \n",
" \n",
" \n",
" 42 \n",
" 1.453200 \n",
" \n",
" \n",
" 43 \n",
" 0.797300 \n",
" \n",
" \n",
" 44 \n",
" 1.013500 \n",
" \n",
" \n",
" 45 \n",
" 0.786900 \n",
" \n",
" \n",
" 46 \n",
" 0.894800 \n",
" \n",
" \n",
" 47 \n",
" 1.177700 \n",
" \n",
" \n",
" 48 \n",
" 1.086000 \n",
" \n",
" \n",
" 49 \n",
" 1.651600 \n",
" \n",
" \n",
" 50 \n",
" 0.895400 \n",
" \n",
" \n",
" 51 \n",
" 1.011800 \n",
" \n",
" \n",
" 52 \n",
" 1.160500 \n",
" \n",
" \n",
" 53 \n",
" 1.573800 \n",
" \n",
" \n",
" 54 \n",
" 1.163000 \n",
" \n",
" \n",
" 55 \n",
" 1.072900 \n",
" \n",
" \n",
" 56 \n",
" 1.035000 \n",
" \n",
" \n",
" 57 \n",
" 0.611800 \n",
" \n",
" \n",
" 58 \n",
" 0.743100 \n",
" \n",
" \n",
" 59 \n",
" 1.096500 \n",
" \n",
" \n",
" 60 \n",
" 1.139200 \n",
" \n",
" \n",
" 61 \n",
" 5.855300 \n",
" \n",
" \n",
" 62 \n",
" 1.619300 \n",
" \n",
" \n",
" 63 \n",
" 2.743400 \n",
" \n",
" \n",
" 64 \n",
" 6.773300 \n",
" \n",
" \n",
" 65 \n",
" 1.551500 \n",
" \n",
" \n",
" 66 \n",
" 1.018600 \n",
" \n",
" \n",
" 67 \n",
" 1.284600 \n",
" \n",
" \n",
" 68 \n",
" 1.117300 \n",
" \n",
" \n",
" 69 \n",
" 1.509800 \n",
" \n",
" \n",
" 70 \n",
" 1.074700 \n",
" \n",
" \n",
" 71 \n",
" 0.755100 \n",
" \n",
" \n",
" 72 \n",
" 0.497300 \n",
" \n",
" \n",
" 73 \n",
" 0.875900 \n",
" \n",
" \n",
" 74 \n",
" 0.597500 \n",
" \n",
" \n",
" 75 \n",
" 0.563600 \n",
" \n",
" \n",
" 76 \n",
" 5.032300 \n",
" \n",
" \n",
" 77 \n",
" 0.779900 \n",
" \n",
" \n",
" 78 \n",
" 0.679300 \n",
" \n",
" \n",
" 79 \n",
" 0.577900 \n",
" \n",
" \n",
" 80 \n",
" 0.581000 \n",
" \n",
" \n",
" 81 \n",
" 0.407500 \n",
" \n",
" \n",
" 82 \n",
" 0.902900 \n",
" \n",
" \n",
" 83 \n",
" 0.644100 \n",
" \n",
" \n",
" 84 \n",
" 1.133200 \n",
" \n",
" \n",
" 85 \n",
" 1.693800 \n",
" \n",
" \n",
" 86 \n",
" 0.898600 \n",
" \n",
" \n",
" 87 \n",
" 0.657700 \n",
" \n",
" \n",
" 88 \n",
" 1.592000 \n",
" \n",
" \n",
" 89 \n",
" 0.733100 \n",
" \n",
" \n",
" 90 \n",
" 4.917600 \n",
" \n",
" \n",
" 91 \n",
" 10.263400 \n",
" \n",
" \n",
" 92 \n",
" 15.912000 \n",
" \n",
" \n",
" 93 \n",
" 1.027500 \n",
" \n",
" \n",
" 94 \n",
" 0.480500 \n",
" \n",
" \n",
" 95 \n",
" 0.616700 \n",
" \n",
" \n",
" 96 \n",
" 0.356600 \n",
" \n",
" \n",
" 97 \n",
" 0.444900 \n",
" \n",
" \n",
" 98 \n",
" 0.373300 \n",
" \n",
" \n",
" 99 \n",
" 0.169900 \n",
" \n",
" \n",
" 100 \n",
" 0.421400 \n",
" \n",
" \n",
" 101 \n",
" 0.390900 \n",
" \n",
" \n",
" 102 \n",
" 0.608500 \n",
" \n",
" \n",
" 103 \n",
" 0.467300 \n",
" \n",
" \n",
" 104 \n",
" 0.441600 \n",
" \n",
" \n",
" 105 \n",
" 0.422500 \n",
" \n",
" \n",
" 106 \n",
" 0.222500 \n",
" \n",
" \n",
" 107 \n",
" 5.019700 \n",
" \n",
" \n",
" 108 \n",
" 0.546100 \n",
" \n",
" \n",
" 109 \n",
" 0.521400 \n",
" \n",
" \n",
" 110 \n",
" 0.753000 \n",
" \n",
" \n",
" 111 \n",
" 0.584300 \n",
" \n",
" \n",
" 112 \n",
" 0.837300 \n",
" \n",
" \n",
" 113 \n",
" 0.472400 \n",
" \n",
" \n",
" 114 \n",
" 0.333900 \n",
" \n",
" \n",
" 115 \n",
" 0.428700 \n",
" \n",
" \n",
" 116 \n",
" 0.467700 \n",
" \n",
" \n",
" 117 \n",
" 0.369400 \n",
" \n",
" \n",
" 118 \n",
" 0.344600 \n",
" \n",
" \n",
" 119 \n",
" 0.395600 \n",
" \n",
" \n",
" 120 \n",
" 0.357300 \n",
" \n",
" \n",
" 121 \n",
" 0.300700 \n",
" \n",
" \n",
" 122 \n",
" 0.325000 \n",
" \n",
" \n",
" 123 \n",
" 1.147700 \n",
" \n",
" \n",
" 124 \n",
" 0.571100 \n",
" \n",
" \n",
" 125 \n",
" 0.716600 \n",
" \n",
" \n",
" 126 \n",
" 0.876000 \n",
" \n",
" \n",
" 127 \n",
" 0.438000 \n",
" \n",
" \n",
" 128 \n",
" 0.252000 \n",
" \n",
" \n",
" 129 \n",
" 0.387100 \n",
" \n",
" \n",
" 130 \n",
" 0.275600 \n",
" \n",
" \n",
" 131 \n",
" 0.732000 \n",
" \n",
" \n",
" 132 \n",
" 0.538800 \n",
" \n",
" \n",
" 133 \n",
" 0.491600 \n",
" \n",
" \n",
" 134 \n",
" 0.258900 \n",
" \n",
" \n",
" 135 \n",
" 0.256400 \n",
" \n",
" \n",
" 136 \n",
" 0.289900 \n",
" \n",
" \n",
" 137 \n",
" 0.243500 \n",
" \n",
" \n",
" 138 \n",
" 0.282100 \n",
" \n",
" \n",
" 139 \n",
" 0.436600 \n",
" \n",
" \n",
" 140 \n",
" 0.329900 \n",
" \n",
" \n",
" 141 \n",
" 0.192600 \n",
" \n",
" \n",
" 142 \n",
" 0.283100 \n",
" \n",
" \n",
" 143 \n",
" 0.248700 \n",
" \n",
" \n",
" 144 \n",
" 0.236600 \n",
" \n",
" \n",
" 145 \n",
" 0.392700 \n",
" \n",
" \n",
" 146 \n",
" 0.541900 \n",
" \n",
" \n",
" 147 \n",
" 0.308900 \n",
" \n",
" \n",
" 148 \n",
" 0.215700 \n",
" \n",
" \n",
" 149 \n",
" 0.292200 \n",
" \n",
" \n",
" 150 \n",
" 0.280200 \n",
" \n",
" \n",
" 151 \n",
" 0.228000 \n",
" \n",
" \n",
" 152 \n",
" 0.215800 \n",
" \n",
" \n",
" 153 \n",
" 0.183700 \n",
" \n",
" \n",
" 154 \n",
" 0.360800 \n",
" \n",
" \n",
" 155 \n",
" 0.106800 \n",
" \n",
" \n",
" 156 \n",
" 0.110200 \n",
" \n",
" \n",
" 157 \n",
" 0.172400 \n",
" \n",
" \n",
" 158 \n",
" 0.151800 \n",
" \n",
" \n",
" 159 \n",
" 0.337000 \n",
" \n",
" \n",
" 160 \n",
" 0.206900 \n",
" \n",
" \n",
" 161 \n",
" 0.350600 \n",
" \n",
" \n",
" 162 \n",
" 0.095200 \n",
" \n",
" \n",
" 163 \n",
" 0.127200 \n",
" \n",
" \n",
" 164 \n",
" 0.146800 \n",
" \n",
" \n",
" 165 \n",
" 0.191800 \n",
" \n",
" \n",
" 166 \n",
" 0.278800 \n",
" \n",
" \n",
" 167 \n",
" 0.203200 \n",
" \n",
" \n",
" 168 \n",
" 0.193900 \n",
" \n",
" \n",
" 169 \n",
" 0.062100 \n",
" \n",
" \n",
" 170 \n",
" 0.158200 \n",
" \n",
" \n",
" 171 \n",
" 0.173700 \n",
" \n",
" \n",
" 172 \n",
" 0.276500 \n",
" \n",
" \n",
" 173 \n",
" 0.247900 \n",
" \n",
" \n",
" 174 \n",
" 0.246000 \n",
" \n",
" \n",
" 175 \n",
" 0.177600 \n",
" \n",
" \n",
" 176 \n",
" 0.166400 \n",
" \n",
" \n",
" 177 \n",
" 0.153300 \n",
" \n",
" \n",
" 178 \n",
" 0.104300 \n",
" \n",
" \n",
" 179 \n",
" 0.082900 \n",
" \n",
" \n",
" 180 \n",
" 0.120300 \n",
" \n",
" \n",
" 181 \n",
" 0.193300 \n",
" \n",
" \n",
" 182 \n",
" 0.158800 \n",
" \n",
" \n",
" 183 \n",
" 0.101400 \n",
" \n",
" \n",
" 184 \n",
" 0.051500 \n",
" \n",
" \n",
" 185 \n",
" 0.065700 \n",
" \n",
" \n",
" 186 \n",
" 0.492000 \n",
" \n",
" \n",
" 187 \n",
" 0.102400 \n",
" \n",
" \n",
" 188 \n",
" 0.196900 \n",
" \n",
" \n",
" 189 \n",
" 0.202600 \n",
" \n",
" \n",
" 190 \n",
" 0.115700 \n",
" \n",
" \n",
" 191 \n",
" 0.079100 \n",
" \n",
" \n",
" 192 \n",
" 0.115200 \n",
" \n",
" \n",
" 193 \n",
" 0.182000 \n",
" \n",
" \n",
" 194 \n",
" 0.114600 \n",
" \n",
" \n",
" 195 \n",
" 0.132200 \n",
" \n",
" \n",
" 196 \n",
" 0.135900 \n",
" \n",
" \n",
" 197 \n",
" 0.063800 \n",
" \n",
" \n",
" 198 \n",
" 0.049400 \n",
" \n",
" \n",
" 199 \n",
" 0.056400 \n",
" \n",
" \n",
" 200 \n",
" 0.060900 \n",
" \n",
" \n",
" 201 \n",
" 0.111500 \n",
" \n",
" \n",
" 202 \n",
" 0.079900 \n",
" \n",
" \n",
" 203 \n",
" 0.120600 \n",
" \n",
" \n",
" 204 \n",
" 0.034400 \n",
" \n",
" \n",
" 205 \n",
" 0.044000 \n",
" \n",
" \n",
" 206 \n",
" 0.088100 \n",
" \n",
" \n",
" 207 \n",
" 0.041800 \n",
" \n",
" \n",
" 208 \n",
" 0.055100 \n",
" \n",
" \n",
" 209 \n",
" 0.067700 \n",
" \n",
" \n",
" 210 \n",
" 0.084300 \n",
" \n",
" \n",
" 211 \n",
" 0.018500 \n",
" \n",
" \n",
" 212 \n",
" 0.031700 \n",
" \n",
" \n",
" 213 \n",
" 0.023000 \n",
" \n",
" \n",
" 214 \n",
" 0.049500 \n",
" \n",
" \n",
" 215 \n",
" 0.050600 \n",
" \n",
" \n",
" 216 \n",
" 0.071000 \n",
" \n",
" \n",
" 217 \n",
" 0.064800 \n",
" \n",
" \n",
" 218 \n",
" 0.034000 \n",
" \n",
" \n",
" 219 \n",
" 0.025900 \n",
" \n",
" \n",
" 220 \n",
" 0.028800 \n",
" \n",
" \n",
" 221 \n",
" 0.017200 \n",
" \n",
" \n",
" 222 \n",
" 0.040000 \n",
" \n",
" \n",
" 223 \n",
" 0.047600 \n",
" \n",
" \n",
" 224 \n",
" 0.057600 \n",
" \n",
" \n",
" 225 \n",
" 0.013700 \n",
" \n",
" \n",
" 226 \n",
" 0.008300 \n",
" \n",
" \n",
" 227 \n",
" 0.027100 \n",
" \n",
" \n",
" 228 \n",
" 0.016100 \n",
" \n",
" \n",
" 229 \n",
" 0.017700 \n",
" \n",
" \n",
" 230 \n",
" 0.068300 \n",
" \n",
" \n",
" 231 \n",
" 0.017000 \n",
" \n",
" \n",
" 232 \n",
" 0.010700 \n",
" \n",
" \n",
" 233 \n",
" 0.010600 \n",
" \n",
" \n",
" 234 \n",
" 0.009200 \n",
" \n",
" \n",
" 235 \n",
" 0.011800 \n",
" \n",
" \n",
" 236 \n",
" 0.005800 \n",
" \n",
" \n",
" 237 \n",
" 0.022100 \n",
" \n",
" \n",
" 238 \n",
" 0.025600 \n",
" \n",
" \n",
" 239 \n",
" 0.018900 \n",
" \n",
" \n",
" 240 \n",
" 0.010100 \n",
" \n",
" \n",
" 241 \n",
" 0.003800 \n",
" \n",
" \n",
" 242 \n",
" 0.005600 \n",
" \n",
" \n",
" 243 \n",
" 0.007100 \n",
" \n",
" \n",
" 244 \n",
" 0.012300 \n",
" \n",
" \n",
" 245 \n",
" 0.008400 \n",
" \n",
" \n",
" 246 \n",
" 0.005600 \n",
" \n",
" \n",
" 247 \n",
" 0.006700 \n",
" \n",
" \n",
" 248 \n",
" 0.007800 \n",
" \n",
" \n",
" 249 \n",
" 0.007800 \n",
" \n",
" \n",
" 250 \n",
" 0.004900 \n",
" \n",
" \n",
" 251 \n",
" 0.005500 \n",
" \n",
" \n",
" 252 \n",
" 0.008000 \n",
" \n",
" \n",
" 253 \n",
" 0.006200 \n",
" \n",
" \n",
" 254 \n",
" 0.003100 \n",
" \n",
" \n",
" 255 \n",
" 0.007700 \n",
" \n",
" \n",
" 256 \n",
" 0.009300 \n",
" \n",
" \n",
" 257 \n",
" 0.006900 \n",
" \n",
" \n",
" 258 \n",
" 0.005300 \n",
" \n",
" \n",
" 259 \n",
" 0.003200 \n",
" \n",
" \n",
" 260 \n",
" 0.004200 \n",
" \n",
" \n",
" 261 \n",
" 0.005800 \n",
" \n",
" \n",
" 262 \n",
" 0.006500 \n",
" \n",
" \n",
" 263 \n",
" 0.002100 \n",
" \n",
" \n",
" 264 \n",
" 0.005400 \n",
" \n",
" \n",
" 265 \n",
" 0.008900 \n",
" \n",
" \n",
" 266 \n",
" 0.003100 \n",
" \n",
" \n",
" 267 \n",
" 0.009100 \n",
" \n",
" \n",
" 268 \n",
" 0.006200 \n",
" \n",
" \n",
" 269 \n",
" 0.007700 \n",
" \n",
" \n",
" 270 \n",
" 0.001200 \n",
" \n",
" \n",
" 271 \n",
" 0.003000 \n",
" \n",
" \n",
" 272 \n",
" 0.003200 \n",
" \n",
" \n",
" 273 \n",
" 0.004000 \n",
" \n",
" \n",
" 274 \n",
" 0.007700 \n",
" \n",
" \n",
" 275 \n",
" 0.001800 \n",
" \n",
" \n",
" 276 \n",
" 0.002400 \n",
" \n",
" \n",
" 277 \n",
" 0.006400 \n",
" \n",
" \n",
" 278 \n",
" 0.005500 \n",
" \n",
" \n",
" 279 \n",
" 0.002700 \n",
" \n",
" \n",
" \n",
"280 \n",
" 0.006600 \n",
" ### Instruction:\\nExplain how can I stay up to date with the CAMEL community.\\n\\n### Input:\\n\\n\\n### Response:\\nTo keep up to date with the CAMEL community, engage in discussions, contribute to the Discord, and provide support to new contributors.']"
]
},
"metadata": {},
"execution_count": 11
}
],
"source": [
"FastLanguageModel.for_inference(model) # Enable native 2x faster inference\n",
"inputs = tokenizer(\n",
"[\n",
"\n",
" AlpacaItem(\n",
" instruction=\"Explain how can I stay up to date with the CAMEL community.\",\n",
" input=\"\",\n",
" output=\"\", # leave this blank for generation!\n",
" ).to_string()\n",
"\n",
"], return_tensors = \"pt\").to(\"cuda\")\n",
"\n",
"outputs = model.generate(**inputs, max_new_tokens = 512, use_cache = True)\n",
"tokenizer.batch_decode(outputs)"
]
},
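{
"cell_type": "markdown",
"metadata": {},
"source": [
"Optionally, you can stream tokens to the console as they are generated rather than waiting for the full decode. This small variant reuses `inputs` from the cell above with the Transformers TextStreamer:"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"from transformers import TextStreamer\n",
"\n",
"# Print tokens to stdout as soon as they are generated.\n",
"text_streamer = TextStreamer(tokenizer)\n",
"_ = model.generate(**inputs, streamer = text_streamer, max_new_tokens = 512, use_cache = True)"
]
},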
{
"cell_type": "markdown",
"metadata": {
"id": "xSepmhPrgOct"
},
"source": [
"**Summary**\n",
"\n",
"\n",
"We have generated realistic user queries and responses from a real page and trained on them to produce a model that understands the underlying content."
]
},
{
"cell_type": "markdown",
"metadata": {
"id": "_75hf2ZmQ1-M"
},
"source": [
"That's everything: Got questions about 🐫 CAMEL-AI? Join us on [Discord](https://discord.camel-ai.org)! Whether you want to share feedback, explore the latest in multi-agent systems, get support, or connect with others on exciting projects, we’d love to have you in the community! 🤝\n",
"\n",
"Check out some of our other work:\n",
"\n",
"1. 🐫 Creating Your First CAMEL Agent [free Colab](https://docs.camel-ai.org/cookbooks/create_your_first_agent.html)\n",
"\n",
"2. Graph RAG Cookbook [free Colab](https://colab.research.google.com/drive/1uZKQSuu0qW6ukkuSv9TukLB9bVaS1H0U?usp=sharing)\n",
"\n",
"3. 🧑⚖️ Create A Hackathon Judge Committee with Workforce [free Colab](https://colab.research.google.com/drive/18ajYUMfwDx3WyrjHow3EvUMpKQDcrLtr?usp=sharing)\n",
"\n",
"4. 🔥 3 ways to ingest data from websites with Firecrawl & CAMEL [free Colab](https://colab.research.google.com/drive/1lOmM3VmgR1hLwDKdeLGFve_75RFW0R9I?usp=sharing)\n",
"\n",
"5. 🦥 Agentic SFT Data Generation with CAMEL and Mistral Models, Fine-Tuned with Unsloth [free Colab](https://colab.research.google.com/drive/1lYgArBw7ARVPSpdwgKLYnp_NEXiNDOd-?usp=sharing)\n",
"\n",
"6. 🦥 Agentic SFT Data Generation with CAMEL and Qwen Models, Fine-Tuned with Unsloth [free Colab](https://colab.research.google.com/drive/1sMnWOvdmASEMhsRIOUSAeYuEywby6FRV?usp=sharing)\n",
"\n",
"Thanks from everyone at 🐫 CAMEL-AI\n",
"\n",
"\n",
"\n",
"
\n",
" \n",
"⭐ Star us on Github , join our [*Discord*](https://discord.camel-ai.org) or follow our [*X*](https://x.com/camelaiorg) ⭐\n",
"