Technology sharing

How do you continue training after saving a PyTorch model locally?

2024-07-12


In PyTorch, you can save and load a model and then continue training by following these steps:

  1. Save the model

    There are generally two ways to save a model:

    • Save the entire model (including the network structure, weights, etc.):

      torch.save(model, 'model.pth')
    • Save only the model's state_dict (containing just the parameter weights). This method is recommended because it saves storage space and is more flexible when loading:

      torch.save(model.state_dict(), 'model_weights.pth')
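
    Note that step 3 below loads optimizer state from 'optimizer.pth', so the optimizer's state_dict has to be saved as well. A minimal sketch, assuming an already-constructed optimizer named optimizer:

      # Save the optimizer state (momentum buffers, etc.) so training
      # can resume exactly where it left off (loaded again in step 3)
      torch.save(optimizer.state_dict(), 'optimizer.pth')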
  2. Load the model

    Correspondingly, the model can be loaded in two ways:

    • If you previously saved the entire model, you can load it directly:

      model = torch.load('model.pth')
    • If only the state_dict was saved, you must first instantiate a model with the same structure as the original, then load the weights with load_state_dict():

      # Instantiate a model with the same structure as the original
      model = YourModelClass()
      # Load the saved state_dict
      model.load_state_dict(torch.load('model_weights.pth'))
      # Make sure to move the model to the correct device (e.g. GPU or CPU)
      device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
      model.to(device)
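
    If the weights were saved on a different device than the one you are loading on (for example, saved on a GPU machine and loaded on a CPU-only machine), torch.load accepts a map_location argument. A small sketch of the same load with the device mapped explicitly:

      # Remap the stored tensors onto the target device while loading
      device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
      model.load_state_dict(torch.load('model_weights.pth', map_location=device))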
  3. Continue training

    Once the model is loaded, you can continue training. Define the loss function and optimizer, and make sure their states are loaded correctly (if you saved them earlier). Then follow the usual training procedure (a combined checkpoint pattern is sketched after the loop):

    import torch.nn as nn
    import torch.optim as optim

    # Define the loss function and optimizer
    criterion = nn.CrossEntropyLoss()
    optimizer = optim.SGD(model.parameters(), lr=0.01, momentum=0.9)
    # If the optimizer state was saved earlier, it can be loaded too
    optimizer.load_state_dict(torch.load('optimizer.pth'))
    # Start training (num_epochs and dataloader are defined elsewhere)
    for epoch in range(num_epochs):
        for inputs, labels in dataloader:
            inputs, labels = inputs.to(device), labels.to(device)
            optimizer.zero_grad()
            outputs = model(inputs)
            loss = criterion(outputs, labels)
            loss.backward()
            optimizer.step()
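
    To keep the resumed run itself resumable, a common pattern is to bundle the model, optimizer, and progress into a single checkpoint dictionary. A sketch assuming the names above and a hypothetical file name 'checkpoint.pth':

    # At the end of each epoch, save everything needed to resume in one file
    torch.save({
        'epoch': epoch,
        'model_state_dict': model.state_dict(),
        'optimizer_state_dict': optimizer.state_dict(),
    }, 'checkpoint.pth')

    # Later, restore both states and continue from the next epoch
    checkpoint = torch.load('checkpoint.pth', map_location=device)
    model.load_state_dict(checkpoint['model_state_dict'])
    optimizer.load_state_dict(checkpoint['optimizer_state_dict'])
    start_epoch = checkpoint['epoch'] + 1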

In this way, you can resume training the model from where you last saved it.