`PyTorch` is an open-source machine learning library widely used for applications such as computer vision and natural language processing. It is known for its flexibility, speed, and ease of use. Developed by Facebook's AI Research lab (FAIR), PyTorch provides two high-level features: tensor computation (like NumPy) with strong GPU acceleration, and deep neural networks built on a tape-based autograd system. Here's a concise reference guide for common use cases with `PyTorch`:

# `PyTorch` Reference Guide

## Installation

```
pip install torch torchvision
```

## Basic Concepts

### Importing PyTorch

```python
import torch
import torch.nn as nn
import torch.nn.functional as F  # functional ops (e.g. F.relu) used in the model below
import torch.optim as optim
import torchvision.transforms as transforms
```

### Tensors

Tensors are specialized data structures, very similar to arrays and matrices. In PyTorch, we use tensors to encode the inputs and outputs of a model, as well as the model's parameters.

```python
# Create a tensor from data (torch.tensor infers the dtype)
x = torch.tensor([[1, 2], [3, 4]])

# Create a tensor with random data
random_tensor = torch.rand(2, 3)

# Create a tensor filled with zeros
zeros_tensor = torch.zeros(2, 3)

# Select the GPU if CUDA is available, otherwise fall back to the CPU
device = "cuda" if torch.cuda.is_available() else "cpu"
```
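For a feel of how tensors behave, here is a small illustrative sketch of a couple of common operations and of moving data onto the selected `device` (the variable names are just examples):

```python
a = torch.rand(2, 3)
b = torch.rand(2, 3)

elementwise_sum = a + b   # elementwise addition, shape (2, 3)
product = a @ b.T         # matrix multiplication, shape (2, 2)

# Move a tensor to the chosen device; subsequent operations on it run there
a_on_device = a.to(device)
print(elementwise_sum.shape, product.shape, a_on_device.device)
```
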
## Building Models

### Defining a Neural Network

```python
class Net(nn.Module):
    def __init__(self):
        super(Net, self).__init__()
        self.conv1 = nn.Conv2d(1, 6, 5)
        self.pool = nn.MaxPool2d(2, 2)
        self.conv2 = nn.Conv2d(6, 16, 5)
        self.fc1 = nn.Linear(16 * 5 * 5, 120)
        self.fc2 = nn.Linear(120, 84)
        self.fc3 = nn.Linear(84, 10)

    def forward(self, x):
        x = self.pool(F.relu(self.conv1(x)))
        x = self.pool(F.relu(self.conv2(x)))
        x = x.view(-1, 16 * 5 * 5)
        x = F.relu(self.fc1(x))
        x = F.relu(self.fc2(x))
        return self.fc3(x)


net = Net().to(device)
```
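This LeNet-style architecture expects single-channel 32×32 inputs: two 5×5 convolutions with 2×2 pooling reduce a 32×32 image to 16 feature maps of size 5×5, which is where the `16 * 5 * 5` in `fc1` comes from. A quick, purely illustrative sanity check with a dummy batch:

```python
# Run a dummy batch through the network to verify the expected shapes
dummy = torch.randn(4, 1, 32, 32).to(device)  # batch of 4 single-channel 32x32 images
out = net(dummy)
print(out.shape)  # torch.Size([4, 10]) -- one score per class
```
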
## Training a Model

### Loss Function and Optimizer

```python
criterion = nn.CrossEntropyLoss()
optimizer = optim.SGD(net.parameters(), lr=0.001, momentum=0.9)
```
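The training loop in the next section iterates over a `trainloader` and uses `num_epochs`, neither of which is defined above. Here is a minimal sketch of how they might be set up, assuming MNIST resized to 32×32 so it matches the single-channel `Net` defined earlier (the dataset choice, batch size, and epoch count are all illustrative):

```python
import torchvision

transform = transforms.Compose([
    transforms.Resize(32),   # MNIST is 28x28; resize so 16*5*5 matches fc1
    transforms.ToTensor(),
])

trainset = torchvision.datasets.MNIST(root='./data', train=True, download=True, transform=transform)
trainloader = torch.utils.data.DataLoader(trainset, batch_size=64, shuffle=True)

testset = torchvision.datasets.MNIST(root='./data', train=False, download=True, transform=transform)
testloader = torch.utils.data.DataLoader(testset, batch_size=64, shuffle=False)

num_epochs = 2
```
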
### Training Loop

```python
for epoch in range(num_epochs):
    running_loss = 0.0
    for i, data in enumerate(trainloader, 0):
        inputs, labels = data[0].to(device), data[1].to(device)

        optimizer.zero_grad()

        outputs = net(inputs)
        loss = criterion(outputs, labels)
        loss.backward()
        optimizer.step()

        running_loss += loss.item()
    print(f'Epoch {epoch + 1}, Loss: {running_loss / len(trainloader)}')
```
## Evaluating the Model

```python
correct = 0
total = 0
with torch.no_grad():
    for data in testloader:
        images, labels = data[0].to(device), data[1].to(device)
        outputs = net(images)
        _, predicted = torch.max(outputs.data, 1)
        total += labels.size(0)
        correct += (predicted == labels).sum().item()

print(f'Accuracy of the network on test images: {100 * correct / total}%')
```
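One detail worth adding in practice: switch the model to evaluation mode before running inference, and back to training mode afterwards, so layers such as dropout and batch norm behave correctly. The `Net` above has neither, so it makes no difference here, but it is a good habit:

```python
net.eval()   # evaluation mode: disables dropout, uses running stats in batch norm
# ... run the evaluation loop above ...
net.train()  # switch back before further training
```
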
## Saving and Loading Models

```python
# Save only the model's parameters (its state_dict)
torch.save(net.state_dict(), 'model.pth')

# Load: recreate the architecture, then load the saved parameters into it
net = Net()
net.load_state_dict(torch.load('model.pth'))
net.to(device)
```
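If a checkpoint saved on a GPU machine is later loaded where only a CPU is available, `torch.load` can remap the stored tensors via `map_location`. A small sketch, reusing the `device` variable from earlier:

```python
# Load a GPU-trained checkpoint onto whatever device is currently available
state_dict = torch.load('model.pth', map_location=device)
net.load_state_dict(state_dict)
```
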
`PyTorch` excels at offering flexibility and speed during the development of complex machine learning models. Its dynamic computation graph allows the graph to be modified on the fly and maps closely onto ordinary Pythonic programming. This guide covers foundational concepts and tasks in PyTorch, but the library's capabilities extend to advanced machine learning and artificial intelligence projects.
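To make the dynamic-graph point concrete: because the graph is built as the code runs, ordinary Python control flow can decide the shape of the computation, and autograd still differentiates through whatever was executed. A tiny illustrative sketch:

```python
x = torch.tensor(2.0, requires_grad=True)
y = x
# The number of multiplications depends on the data itself
while y < 100:
    y = y * x
y.backward()
print(y.item(), x.grad.item())  # 128.0 448.0  (y = x**7, dy/dx = 7*x**6)
```
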
PyTorch's intuitive design and ease of use, along with its comprehensive documentation and vibrant community, make it a preferred tool for both academic researchers and developers in industry.