
lua – torch7 neural network training error

I am trying to implement a neural network example in torch7. My data [19 cols x 10000 rows] is stored in a text file in this form:

11 38 20 44 11 38 21 44 29 42 30 44 34 38  6 34 45 42 1
11 38 20 44 11 38 27 44 31 42 18 44 34 38  6 34 45 42 2
6  42 20 44 11 38 21 44 29 42 30 44 34 38  6 34 45 42 3
...
34 40 20 44 11 38 21 44 29 38 30 38 34 45 38  0  0  0 100
...

The labels are in the last column [100 labels].

Using this code:

require 'nn'
-- ======================================= --
--           Start loading data   
-- ======================================= --
print '[INFO] Loading data..'
dataset = {}
function dataset:size() return 10000 end 
local lin = 1

train_file = 'train_10000.t7'
local file = io.open(train_file)
if file then
    for line in file:lines() do
        local input  = torch.Tensor(18);
        local output = torch.Tensor(1);

        local X1,X2,X3,X4,X5,X6,X7,X8,X9,X10,X11,X12,X13,X14,X15,X16,X17,X18,Y = unpack(line:split(" "))

        input  = {X1,X18}
        output = Y

        dataset[lin] = {input,output}
        lin = lin + 1
    end
end
-- ======================================= --
--                 Create NN   
-- ======================================= --
print '[INFO] Creating NN..'
mlp = nn.Sequential();  -- make a multi-layer perceptron
inputs = 18; outputs = 1; HUs = 25; -- parameters
mlp:add(nn.Linear(inputs,HUs))
mlp:add(nn.Tanh())
mlp:add(nn.Linear(HUs,outputs))
-- ======================================= --
--           MSE and Training  
-- ======================================= --
print '[INFO] MSE and train NN..'
criterion = nn.MSECriterion()  
trainer = nn.StochasticGradient(mlp,criterion)
trainer.learningRate = 0.01
trainer:train(dataset)

I get this error message:

# StochasticGradient: training  
/home/yosaikan/torch/install/share/lua/5.1/nn/Linear.lua:34: attempt to call method 'dim' (a nil value)
stack traceback:
    /home/yosaikan/torch/install/share/lua/5.1/nn/Linear.lua:34: in function 'updateOutput'
    ...e/yosaikan/torch/install/share/lua/5.1/nn/Sequential.lua:25: in function 'forward'
    ...an/torch/install/share/lua/5.1/nn/StochasticGradient.lua:35: in function 'train'
    iparseSchemeConversion.lua:45: in main chunk
    [C]: in function 'f'
    [string "local f = function() return dofile 'iparseSch..."]:1: in main chunk
    [C]: in function 'xpcall'
    /home/yosaikan/torch/install/share/lua/5.1/itorch/main.lua:174: in function </home/yosaikan/torch/install/share/lua/5.1/itorch/main.lua:140>
    /home/yosaikan/torch/install/share/lua/5.1/lzmq/poller.lua:75: in function 'poll'
    .../yosaikan/torch/install/share/lua/5.1/lzmq/impl/loop.lua:307: in function 'poll'
    .../yosaikan/torch/install/share/lua/5.1/lzmq/impl/loop.lua:325: in function 'sleep_ex'
    .../yosaikan/torch/install/share/lua/5.1/lzmq/impl/loop.lua:370: in function 'start'
    /home/yosaikan/torch/install/share/lua/5.1/itorch/main.lua:341: in main chunk
    [C]: in function 'require'
    (command line):1: in main chunk
    [C]: at 0x00405980

Can you please help me?

Thanks.

Solution

I got this error message […] Can you please help me?

In your dataset the inputs and outputs should be Tensors. Here your input is a plain Lua table, which is why you get this error: a table has no 'dim' method.
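
If you would rather keep the space-separated file and the manual loading loop, a minimal sketch of the fix (assuming the 19-column format shown in the question) is to replace the loading block with one that parses every field into a number and builds the Tensors directly. Note it uses gmatch('%S+') instead of line:split(" "), so the runs of double spaces in the sample do not produce empty fields:

require 'torch'

local dataset = {}
function dataset:size() return 10000 end

local lin = 1
local file = io.open('train_10000.t7')
if file then
    for line in file:lines() do
        -- collect all 19 whitespace-separated fields as numbers
        local fields = {}
        for w in line:gmatch('%S+') do
            fields[#fields + 1] = tonumber(w)
        end
        local input = torch.Tensor(18)
        for j = 1, 18 do input[j] = fields[j] end  -- 18 feature columns
        local output = torch.Tensor{fields[19]}    -- label column, as a 1-element Tensor
        dataset[lin] = {input, output}
        lin = lin + 1
    end
    file:close()
end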

To ease the data loading I would advise you to use a csv parser: for example, you can use csv2tensor to load the data into Tensors.

First make sure to add a header (as the first line) to your file, like:

x001,x002,x003,x004,x005,x006,x007,x008,x009,x010,x011,x012,x013,x014,x015,x016,x017,x018,label
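
The file from the question is space-separated rather than comma-separated, so it first needs a one-off conversion that also adds this header. A minimal Lua sketch (the file names train_10000.t7 and data.csv are taken from the question and the snippet below):

local header = {}
for i = 1, 18 do header[i] = string.format('x%03d', i) end
header[19] = 'label'

local out = assert(io.open('data.csv', 'w'))
out:write(table.concat(header, ','), '\n')
for line in io.lines('train_10000.t7') do
    -- re-join the whitespace-separated fields with commas
    local fields = {}
    for w in line:gmatch('%S+') do fields[#fields + 1] = w end
    out:write(table.concat(fields, ','), '\n')
end
out:close()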

Then load your data as follows:

local csv2tensor = require 'csv2tensor'

local inputs = csv2tensor.load("data.csv",{exclude={"label"}})
local labels = csv2tensor.load("data.csv",{include={"label"}})

local dataset = {}

for i=1,inputs:size(1) do
  dataset[i] = {inputs[i],torch.Tensor{labels[i]}}
end

dataset.size = function(self)
  return inputs:size(1)
end
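
As an optional sanity check you can print the shapes before training; the comments show the values expected for the [19 cols x 10000 rows] file from the question:

print(inputs:size())         -- 10000 x 18
print(labels:size())         -- 10000
print(dataset[1][1]:dim())   -- 1 (an 18-element input Tensor)
print(dataset[1][2]:dim())   -- 1 (a 1-element target Tensor, so 'dim' now exists)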

And use this dataset for training:

-- ...
trainer:train(dataset)
