
Published in the Hong Kong Economic Times (Financial News, EJ Tech Innovation Lab), July 19, 2024
In the past, training large language models (LLMs) required cloud computing power, and building one at home was hard to imagine. Taiwan's GIGABYTE Technology recently launched a product targeting the home market: AI TOP Utility, a tool for training and fine-tuning AI models locally. Running on the Linux operating system with a GIGABYTE motherboard, GPU, SSD and power supply, it lets users train models with up to 236 billion parameters on their own machines. The company says this approach is cost-effective, delivers instant results, and keeps sensitive data under the user's control.

Cost-effective and simplified process
AI TOP Utility can offload AI training data to system memory or even to SSD, breaking the limit imposed by VRAM (video random access memory) capacity. The tool claims to simplify the workflow: beginners can start their own AI projects with just a few clicks, without entering complex command-line parameters or having any AI programming skills. A visual dashboard monitors the progress of AI training and the status of the system hardware. In addition, users can schedule AI training, for example to avoid peak electricity consumption or to run during periods with lower electricity costs.
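The offloading idea can be sketched in a few lines. The snippet below is an illustrative sketch only, not GIGABYTE's actual implementation: it estimates a fine-tuning job's memory footprint and shows how the portion that exceeds VRAM could spill over to system RAM and then SSD. The 16-bytes-per-parameter figure is a common rule of thumb for mixed-precision Adam fine-tuning (fp16 weights and gradients plus fp32 master weights and optimizer states); the capacity numbers are made-up examples.

```python
def finetune_footprint_gb(params_billions: float, bytes_per_param: int = 16) -> float:
    """Rough memory footprint of full fine-tuning, using the common
    ~16 bytes/parameter estimate for mixed-precision Adam."""
    return params_billions * 1e9 * bytes_per_param / 1e9


def tier_allocation(total_gb: float, vram_gb: float, ram_gb: float) -> dict:
    """Greedily fill VRAM first, then system RAM, and spill the rest to SSD.
    This is the basic idea behind offloading past the VRAM limit."""
    vram = min(total_gb, vram_gb)
    ram = min(total_gb - vram, ram_gb)
    ssd = total_gb - vram - ram
    return {"vram": vram, "ram": ram, "ssd": ssd}


if __name__ == "__main__":
    # The article's 236B-parameter upper bound, on a hypothetical machine
    # with a 24 GB GPU and 192 GB of system RAM: most of the footprint
    # would have to live in RAM and on SSD.
    total = finetune_footprint_gb(236)
    print(f"estimated footprint: {total:.0f} GB")
    print(tier_allocation(total, vram_gb=24, ram_gb=192))
```

The point of the sketch is that once optimizer state and training data can live outside VRAM, the GPU's memory size stops being a hard ceiling on model size, at the cost of slower transfers to RAM and SSD.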
The tool works with GIGABYTE hardware, including graphics cards based on Nvidia's GeForce RTX 40 series and AMD's Radeon RX 7900, Radeon Pro W7900 and W7800 series. It currently supports more than 70 open-source LLM backbone models on the AI collaboration platform Hugging Face, including Baichuan 2, Distill-GPT2, GLM4, Llama 2 and Llama 3. GIGABYTE also offers an AI TOP Tutor technical support service, where users can put questions to a chatbot.

Support EJ Tech

If you want to submit articles, report information, issue press releases or interview notices, click here to contact us.