
Shengsi (MindSpore) Large Models: Using MindFormers - Installing and Configuring the Environment from Scratch

2024-07-12


The goal of the MindSpore Transformers suite is to provide a full-process development suite for large-model training, fine-tuning, evaluation, inference, and deployment. It offers the industry's mainstream Transformer-based pre-trained models and SOTA downstream-task applications, covers a rich set of parallel features, and aims to help users carry out large-model training and innovative R&D with ease.

MindSpore Transformers is based on MindSpore's built-in parallel technology and component-based design, and has the following features:

  • One line of code enables seamless switching from single-card to large-scale cluster training;
  • Provides flexible, easy-to-use, personalized parallel configuration;
  • Automatically senses the topology to efficiently combine data-parallel and model-parallel strategies;
  • Starts single-card/multi-card training, fine-tuning, evaluation, and inference for any task with one click;
  • Lets users configure any module in a componentized way, such as the optimizer, learning-rate schedule, or network assembly;
  • Provides high-level, easy-to-use interfaces such as Trainer, pipeline, and AutoClass;
  • Automatically downloads and loads preset SOTA weights;
  • Supports seamless migration and deployment to AI computing centers.

MindSpore Large Model Platform (mindspore.cn)

mindformers (gitee.com): a full-process suite for large-model training, inference, and deployment, providing the industry's mainstream Transformer-based pre-trained models and rich parallel features, aimed at making large-model training easy. Documentation: https://mindformers.readthedocs.io/zh-cn/latest/

1. Installation

First, install git in the Linux (Ubuntu) environment:

sudo apt install git
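
You can quickly confirm that git is available before continuing:

git --version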

Get mindformers

git clone -b r1.1.0 https://gitee.com/mindspore/mindformers.git

Enter the directory and execute the script

cd mindformers
bash build.sh

This script relies on Python's setuptools, which in turn needs the distutils module. If build.sh fails with "No module named 'distutils.cmd'" (the CSDN post "No module named 'distutils.cmd'" covers the same issue), install the distutils package that matches your Python version, e.g. for Python 3.7:

sudo apt-get install python3.7-distutils

The Python version must be at least 3.7; I recommend installing 3.9 (the CSDN post "Upgrading Python to 3.7 on Ubuntu" describes the upgrade).

To check which version you have, start the interpreter; the version appears in the startup banner:

python3

Type "exit()" to return to the normal command line.
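
A quicker, non-interactive check prints just the version:

python3 --version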

build.sh may report the error ERROR: Invalid requirement: 'mindformers*whl'

Open build.sh in vi and change the python commands in the script to python3.

Press Esc and type ":wq!" to save and exit.
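
If you'd rather not edit the file by hand, a sed one-liner can make the same substitution. This is only a sketch: it assumes GNU sed and that the script invokes the interpreter as the bare word python.

# assumes GNU sed; replaces the bare word "python" (not "python3") throughout build.sh
sed -i 's/\bpython\b/python3/g' build.sh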

Run again

bash build.sh

Once the script runs successfully, the installation is complete.
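
To double-check the install, you can query pip for the package (the 'mindformers*whl' error above suggests build.sh installs a built wheel via pip, so pip should know about it if pip3 is available) and try importing it:

pip3 show mindformers

python3 -c "import mindformers; print('mindformers imported OK')"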

Sometimes a different Python version is needed; note that the python3 command can point to a different interpreter than python. The python3 symlink can be repointed manually:

  1. whereis python3
  2. sudo rm /usr/bin/python3
  3. sudo ln -s /usr/bin/python3.9 /usr/bin/python3
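
Verify that the link now points where you expect:

ls -l /usr/bin/python3

python3 --version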

If you are using 3.9, install the matching distutils package:

sudo apt-get install python3.9-distutils
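
To confirm the distutils fix took effect, try the import that failed earlier:

python3 -c "import distutils.cmd; print('distutils OK')"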