
Commit 855ad97

clean up sizing/spacing of text for better readability. add scroll on mobile view for ecosystem links. add styling to text links.
1 parent bd2711e commit 855ad97

1 file changed: +36 -30 lines

frontend/gpt4all.io/src/App.js (+36 -30)
@@ -224,7 +224,7 @@ function App() {
 
           </div>
           <div className={`w-full flex justify-center ${showMore ? "h-full": "max-h-[300px]"} mt-12`}>
-            <div className='grid grid-cols-1 lg:grid-cols-3 px-4 sm:px-8 md:px-36 relative gap-4 mx-auto h-full overflow-hidden'>
+            <div className='grid grid-cols-1 lg:grid-cols-3 px-4 sm:px-8 md:px-36 relative gap-4 mx-auto h-full overflow-y-auto 2md:overflow-hidden'>
               {
                 ecosystem_links.map((obj, idx) =>
                   <EcosystemItem
@@ -247,19 +247,21 @@ function App() {
           </div>
           <div className='w-full px-4 sm:px-8 md:px-36 flex flex-col justify-center items-center mt-14 gap-8'>
             <h2 className='text-4xl font-bold text-center'>How GPT4All Works</h2>
-            <p className='leading-normal w-full lg:w-2/3 mx-auto'>
-              GPT4All is an ecosystem to train and deploy <b>powerful</b> and <b>customized</b> large language models that run <b>locally</b> on consumer grade CPUs.
-
-            </p>
-            <p className='leading-normal w-full lg:w-2/3 mx-auto'>
-              The goal is simple - be the best instruction tuned assistant-style language model that any person or enterprise can freely use, distribute and build on.
-            </p>
+
+            <div className='w-full lg:w-2/3 px-0 space-y-8 xl:px-32'>
+              <p className='leading-relaxed'>
+                GPT4All is an ecosystem to train and deploy <b>powerful</b> and <b>customized</b> large language models that run <b>locally</b> on consumer grade CPUs.
+              </p>
+              <p className='leading-relaxed'>
+                The goal is simple - be the best instruction tuned assistant-style language model that any person or enterprise can freely use, distribute and build on.
+              </p>
 
-            <p className='leading-normal w-full lg:w-2/3 mx-auto'>
-              A GPT4All model is a 3GB - 8GB file that you can download and plug into the GPT4All open-source ecosystem software. <b>Nomic AI</b> supports and maintains
-              this software ecosystem to enforce quality and security alongside spearheading the effort to allow any person or enterprise to easily train and deploy
-              their own on-edge large language models.
-            </p>
+              <p className='leading-relaxed'>
+                A GPT4All model is a 3GB - 8GB file that you can download and plug into the GPT4All open-source ecosystem software. <b>Nomic AI</b> supports and maintains
+                this software ecosystem to enforce quality and security alongside spearheading the effort to allow any person or enterprise to easily train and deploy
+                their own on-edge large language models.
+              </p>
+            </div>
 
 
           </div>
@@ -270,28 +272,32 @@ function App() {
 
           <div className='w-full px-4 sm:px-8 md:px-36 flex flex-col justify-center items-center mt-14 gap-8'>
             <h2 className='text-4xl font-bold text-center'>GPT4All Datasets</h2>
-            <p className='leading-normal w-full lg:w-2/3 mx-auto'>
-              To train a powerful instruction-tuned assistant on your own data, you need to curate high-quality training and instruction-tuning datasets. Nomic AI has built
-              a platform called <b><a href="https://atlas.nomic.ai/">Atlas</a></b> to make manipulating and curating LLM training data easy.
-            </p>
-            <p className='leading-normal w-full lg:w-2/3 mx-auto'>
-              You can find the latest open-source, Atlas-curated GPT4All dataset on <b><a href="https://huggingface.co/datasets/nomic-ai/gpt4all-j-prompt-generations">Huggingface</a></b>.
-              Make sure to use the latest data version.
-            </p>
+            <div className='w-full lg:w-2/3 px-0 space-y-8 xl:px-32'>
+              <p className='leading-relaxed'>
+                To train a powerful instruction-tuned assistant on your own data, you need to curate high-quality training and instruction-tuning datasets. Nomic AI has built
+                a platform called <b><a className='underline' href="https://atlas.nomic.ai/">Atlas</a></b> to make manipulating and curating LLM training data easy.
+              </p>
+              <p className='leading-relaxed'>
+                You can find the latest open-source, Atlas-curated GPT4All dataset on <b><a className='underline' href="https://huggingface.co/datasets/nomic-ai/gpt4all-j-prompt-generations">Huggingface</a></b>.
+                Make sure to use the latest data version.
+              </p>
+            </div>
 
           </div>
 
           <div className='w-full px-4 sm:px-8 md:px-36 flex flex-col justify-center items-center mt-14 gap-8'>
             <h2 className='text-4xl font-bold text-center'>GPT4All Open Source Datalake</h2>
-            <p className='leading-normal w-full lg:w-2/3 mx-auto'>
-              Data is one the most important ingredients to successfully building a powerful, general purpose large language model. The GPT4All community has built the GPT4All Open Source datalake
-              as a staging ground for contributing instruction and assistant tuning data for future GPT4All model trains. It allows anyone to contribute to the democratic process of training
-              a large language model.
-            </p>
-            <p className='leading-normal w-full lg:w-2/3 mx-auto'>
-              All data contributions to the GPT4All Datalake will be open-sourced in their raw and Atlas-curated form. You can learn more details about the datalake on <b><a href="https://github.com/nomic-ai/gpt4all-datalake">Github</a></b>. You can contribute by using the GPT4All Chat client and 'opting-in' to
-              share your data on start-up. By default, the chat client will not let any conversation history leave your computer.
-            </p>
+            <div className='w-full lg:w-2/3 px-0 space-y-8 xl:px-32'>
+              <p className='leading-relaxed'>
+                Data is one the most important ingredients to successfully building a powerful, general purpose large language model. The GPT4All community has built the GPT4All Open Source datalake
+                as a staging ground for contributing instruction and assistant tuning data for future GPT4All model trains. It allows anyone to contribute to the democratic process of training
+                a large language model.
+              </p>
+              <p className='leading-relaxed'>
+                All data contributions to the GPT4All Datalake will be open-sourced in their raw and Atlas-curated form. You can learn more details about the datalake on <b><a className='underline' href="https://github.com/nomic-ai/gpt4all-datalake">Github</a></b>. You can contribute by using the GPT4All Chat client and 'opting-in' to
+                share your data on start-up. By default, the chat client will not let any conversation history leave your computer.
+              </p>
+            </div>
 
           <p>
             Explore a recent snapshot of the GPT4All Datalake in Atlas below.
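
The first hunk switches the ecosystem-links grid from overflow-hidden to overflow-y-auto 2md:overflow-hidden, so small screens can scroll the list while wider screens keep it clipped behind the show-more toggle. 2md is not one of Tailwind's default breakpoints, so the project's Tailwind config has to declare it as a custom screen. A minimal sketch of what that declaration could look like (the screen name comes from the class above; the pixel value and file contents are assumptions, not the repo's actual config):

// tailwind.config.js (hypothetical excerpt, not part of this commit)
// Declares a custom '2md' screen so variants like `2md:overflow-hidden` compile.
module.exports = {
  content: ['./src/**/*.{js,jsx}'],
  theme: {
    extend: {
      screens: {
        // assumed width between Tailwind's default md (768px) and lg (1024px)
        '2md': '896px',
      },
    },
  },
  plugins: [],
};

With a screen like that defined, the grid scrolls vertically by default and only switches to clipped overflow once the viewport reaches the 2md width.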
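
The remaining hunks drop the per-paragraph leading-normal w-full lg:w-2/3 mx-auto classes and instead wrap each section's copy in a shared div (w-full lg:w-2/3 px-0 space-y-8 xl:px-32) around leading-relaxed paragraphs, so width, padding, and vertical rhythm are set once per section. That wrapper now appears three times; a hypothetical follow-up refactor (not part of this commit) could extract it into a small component:

// Hypothetical helper, not in the actual App.js: centralizes the body-copy
// width, padding, and spacing classes introduced by this commit.
function ProseBlock({ children }) {
  return (
    <div className='w-full lg:w-2/3 px-0 space-y-8 xl:px-32'>
      {children}
    </div>
  );
}

// Usage sketch:
// <ProseBlock>
//   <p className='leading-relaxed'>GPT4All is an ecosystem to train and deploy ...</p>
//   <p className='leading-relaxed'>The goal is simple - ...</p>
// </ProseBlock>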
