Causal Convolutional Neural Network

A causal convolutional neural network is a convolutional neural network designed to respect causality in time series data. Unlike a conventional convolutional network, it guarantees that each output depends only on the current and earlier inputs, which preserves the temporal order of the series and makes it well suited to the prediction and analysis of time series data.

The core idea of a causal convolutional neural network is to build causality into the convolution operation itself. A traditional convolution can see data both before and after the current time point, which in time series prediction can cause information leakage: the prediction at the current time step ends up being influenced by data from future time steps. A causal convolution, by contrast, only sees the current time point and earlier data, never future data, so the causal order of the series is preserved. This makes causal convolutional networks a better fit for time series prediction and analysis.
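
A quick way to see the leakage is to perturb a single "future" value and check whether earlier outputs change. The snippet below is only an illustration (the layer sizes are arbitrary): with an ordinary convolution and symmetric padding, an earlier output is affected by the perturbed future input.

import torch
import torch.nn as nn

torch.manual_seed(0)
conv = nn.Conv1d(1, 1, kernel_size=3, padding=1)  # ordinary conv: window covers t-1, t, t+1

x = torch.randn(1, 1, 10)
x_perturbed = x.clone()
x_perturbed[0, 0, 7] = 99.0  # change only a "future" value

with torch.no_grad():
    y = conv(x)
    y_perturbed = conv(x_perturbed)

# The output at t = 6 already depends on the input at t = 7, so this prints False.
print(torch.allclose(y[..., :7], y_perturbed[..., :7]))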

There are several ways to implement a causal convolutional network; a common one is the causal convolution kernel. A causal convolution kernel is applied so that each output position covers only the current time point and earlier ones, never future data, so the convolution result cannot be contaminated by information from the future. By introducing causal convolution kernels, the network can capture causal relationships in time series data more faithfully and improve model performance.
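
A minimal sketch of this idea in PyTorch (not the only possible realization) is to pad only on the left before an ordinary nn.Conv1d, so each output position covers exactly the current step and the previous kernel_size - 1 steps:

import torch
import torch.nn as nn
import torch.nn.functional as F

class CausalConv1d(nn.Module):
    """Conv1d whose output at time t depends only on inputs at times <= t."""
    def __init__(self, in_channels, out_channels, kernel_size):
        super().__init__()
        self.left_pad = kernel_size - 1
        self.conv = nn.Conv1d(in_channels, out_channels, kernel_size)

    def forward(self, x):
        # x: (batch, channels, time); pad only on the left so no window
        # reaches past the current time step.
        return self.conv(F.pad(x, (self.left_pad, 0)))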

In addition to causal convolution kernels, causal convolutional networks can use other causality-preserving components, such as causal pooling and residual structures. Causal pooling is a pooling operation in which each pooling window contains only the current time point and earlier ones, never future data. This avoids information leakage and improves the stability and robustness of the model.
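
Causal pooling can be sketched the same way; the function below is just one possible realization (an assumption, not a standard library operator), using left padding so that each max-pooling window covers only the current step and the window - 1 steps before it:

import torch.nn.functional as F

def causal_max_pool1d(x, window):
    # x: (batch, channels, time). Pad the left with -inf so the padding never
    # wins the max, then pool with stride 1; the output length equals the input length.
    x = F.pad(x, (window - 1, 0), value=float('-inf'))
    return F.max_pool1d(x, kernel_size=window, stride=1)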

Let us walk through a simple example. First, import the necessary libraries and modules (matplotlib is needed later for plotting the results):

import torch
import torch.nn as nn
import torch.optim as optim
import numpy as np
import pandas as pd
import matplotlib.pyplot as plt
from sklearn.preprocessing import MinMaxScaler

Then, read and scale the data:

# temperature.csv is assumed to contain a 'temperature' column.
data = pd.read_csv('temperature.csv')
scaler = MinMaxScaler(feature_range=(-1, 1))
data['scaled_temperature'] = scaler.fit_transform(data['temperature'].values.reshape(-1, 1))
data.drop(['temperature'], axis=1, inplace=True)
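
If you do not have a temperature.csv at hand, a synthetic series like the one below (purely for illustration: a yearly cycle plus noise) lets you run the rest of the example unchanged:

rng = np.random.default_rng(0)
t = np.arange(1000)
temperature = 20 + 10 * np.sin(2 * np.pi * t / 365) + rng.normal(0, 2, len(t))
data = pd.DataFrame({'temperature': temperature})
scaler = MinMaxScaler(feature_range=(-1, 1))
data['scaled_temperature'] = scaler.fit_transform(data['temperature'].values.reshape(-1, 1))
data.drop(['temperature'], axis=1, inplace=True)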

Then, divide the data set into a training set and a test set:

train_size = int(len(data) * 0.8)
test_size = len(data) - train_size
train_data, test_data = data.iloc[0:train_size], data.iloc[train_size:len(data)]

Next, define the causal convolutional neural network model. Each convolution pads by kernel_size - 1, and the forward pass trims the surplus positions on the right, so every output step depends only on the current and earlier inputs:

class CCN(nn.Module):
    def __init__(self, input_size, output_size, num_filters, kernel_size):
        super(CCN, self).__init__()
        # Padding adds kernel_size - 1 extra positions; the surplus on the right
        # is trimmed in forward() so that no output can see "future" inputs.
        self.trim = kernel_size - 1
        self.conv1 = nn.Conv1d(input_size, num_filters, kernel_size, padding=kernel_size - 1)
        self.conv2 = nn.Conv1d(num_filters, num_filters, kernel_size, padding=kernel_size - 1)
        self.conv3 = nn.Conv1d(num_filters, num_filters, kernel_size, padding=kernel_size - 1)
        self.conv4 = nn.Conv1d(num_filters, num_filters, kernel_size, padding=kernel_size - 1)
        self.conv5 = nn.Conv1d(num_filters, num_filters, kernel_size, padding=kernel_size - 1)
        self.conv6 = nn.Conv1d(num_filters, num_filters, kernel_size, padding=kernel_size - 1)
        self.conv7 = nn.Conv1d(num_filters, num_filters, kernel_size, padding=kernel_size - 1)
        self.conv8 = nn.Conv1d(num_filters, num_filters, kernel_size, padding=kernel_size - 1)
        self.conv9 = nn.Conv1d(num_filters, num_filters, kernel_size, padding=kernel_size - 1)
        self.conv10 = nn.Conv1d(num_filters, output_size, kernel_size, padding=kernel_size - 1)

    def _causal(self, x):
        # Drop the rightmost positions introduced by the padding so the output
        # at time t depends only on inputs at times <= t.
        return x[:, :, :-self.trim] if self.trim > 0 else x

    def forward(self, x):
        # x: (batch_size, input_size, sequence_length)
        x = self._causal(torch.relu(self.conv1(x)))
        x = self._causal(torch.relu(self.conv2(x)))
        x = self._causal(torch.relu(self.conv3(x)))
        x = self._causal(torch.relu(self.conv4(x)))
        x = self._causal(torch.relu(self.conv5(x)))
        x = self._causal(torch.relu(self.conv6(x)))
        x = self._causal(torch.relu(self.conv7(x)))
        x = self._causal(torch.relu(self.conv8(x)))
        x = self._causal(torch.relu(self.conv9(x)))
        x = self._causal(self.conv10(x))
        return x
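
A quick shape check on a dummy batch (the sizes here are arbitrary) confirms that the model maps a (batch, channels, length) input to an output of the same length:

model = CCN(input_size=1, output_size=1, num_filters=8, kernel_size=2)
dummy = torch.randn(4, 1, 10)   # (batch_size, input_size, sequence_length)
print(model(dummy).shape)       # torch.Size([4, 1, 10])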

After the model is defined, the data needs to be preprocessed so that it can be fed into the model. We convert the data into PyTorch tensors and reshape them into 3D tensors of the form (batch_size, input_size, sequence_length), which is the channels-first layout that nn.Conv1d expects:

def create_sequences(data, seq_length):
    # Slide a window of length seq_length over the series; the value right
    # after each window is the prediction target.
    xs = []
    ys = []
    for i in range(len(data) - seq_length):
        x = data[i:(i + seq_length)]
        y = data[i + seq_length]
        xs.append(x)
        ys.append(y)
    return np.array(xs), np.array(ys)

sequence_length = 10
trainX, trainY = create_sequences(train_data['scaled_temperature'].values, sequence_length)
testX, testY = create_sequences(test_data['scaled_temperature'].values, sequence_length)

trainX = torch.from_numpy(trainX).float()
trainY = torch.from_numpy(trainY).float()
testX = torch.from_numpy(testX).float()
testY = torch.from_numpy(testY).float()

# (batch_size, input_size, sequence_length) with a single input channel
trainX = trainX.view(-1, 1, sequence_length)
trainY = trainY.view(-1, 1)
testX = testX.view(-1, 1, sequence_length)
testY = testY.view(-1, 1)
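
It is worth printing the shapes once to make sure the windows line up (the exact batch dimensions depend on how long your series is):

print(trainX.shape, trainY.shape)   # e.g. torch.Size([N_train, 1, 10]) torch.Size([N_train, 1])
print(testX.shape, testY.shape)     # e.g. torch.Size([N_test, 1, 10]) torch.Size([N_test, 1])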

Next, define the training process:

num_epochs = 1000
learning_rate = 0.001
num_filters = 64
kernel_size = 2

model = CCN(input_size=1, output_size=1, num_filters=num_filters, kernel_size=kernel_size)
criterion = nn.MSELoss()
optimizer = optim.Adam(model.parameters(), lr=learning_rate)

for epoch in range(num_epochs):
    optimizer.zero_grad()
    # The model outputs one value per time step; the forecast for the next
    # value is the output at the last time step of each window.
    outputs = model(trainX)[:, :, -1]
    loss = criterion(outputs, trainY)
    loss.backward()
    optimizer.step()

    if epoch % 100 == 0:
        print('Epoch [{}/{}], Loss: {:.4f}'.format(epoch + 1, num_epochs, loss.item()))

Finally, use the test set to evaluate the model:

with torch.no_grad():
    test_outputs = model(testX)[:, :, -1]
    test_loss = criterion(test_outputs, testY)
    print('Test Loss: {:.4f}'.format(test_loss.item()))

    # Undo the MinMax scaling so predictions and targets are in the original units.
    test_outputs = scaler.inverse_transform(test_outputs.numpy())
    testY = scaler.inverse_transform(testY.numpy())

    test_outputs = np.squeeze(test_outputs)
    testY = np.squeeze(testY)

    plt.plot(test_outputs, label='Predicted')
    plt.plot(testY, label='True')
    plt.legend()
    plt.show()
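
Once trained, the model can also produce a one-step-ahead forecast. A minimal sketch, assuming the same 10-step window used during training and the scaler fitted above:

with torch.no_grad():
    # Take the most recent window of scaled values and predict the next one.
    last_window = data['scaled_temperature'].values[-sequence_length:]
    last_window = torch.from_numpy(last_window).float().view(1, 1, sequence_length)
    next_scaled = model(last_window)[:, :, -1]
    next_temperature = scaler.inverse_transform(next_scaled.numpy())
    print('Predicted next temperature: {:.2f}'.format(next_temperature.item()))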

The above is the implementation process of a simple causal convolutional neural network model, which can be used to predict time series data. It should be noted that in actual applications, the model may need to be adjusted and optimized according to specific tasks to achieve better performance.

Compared with traditional convolutional neural networks, causal convolutional networks have clear advantages when processing time series data: they avoid information leakage and preserve the causal order of the series. They have therefore shown good performance in time series prediction and analysis, and have been applied in fields such as speech recognition, natural language processing, and stock prediction, with good results on a number of tasks.
