Neural networks – How can I add an orthogonality regularizer in Keras?
Published: 2020-12-14
I would like to regularize the layers of a CNN with |(W^T · W − I)|. How can I do that in Keras?

From the documentation:

> Any function that takes in a weight matrix and returns a loss contribution tensor can be used as a regularizer
Here is an example implementation:

```python
from keras import backend as K

def l1_reg(weight_matrix):
    return 0.01 * K.sum(K.abs(weight_matrix))

model.add(Dense(64, input_dim=64, kernel_regularizer=l1_reg))
```

The loss from your post would be:

```python
from keras import backend as K

def fro_norm(w):
    # Frobenius norm of a tensor
    return K.sqrt(K.sum(K.square(K.abs(w))))

def cust_reg(w):
    # W^T * W is (units, units) for a (input_dim, units) kernel,
    # so the identity must match the number of columns of W
    m = K.dot(K.transpose(w), w) - K.eye(K.int_shape(w)[1])
    return fro_norm(m)
```

Here is a minimal example:

```python
import numpy as np
from keras import backend as K
from keras.models import Sequential
from keras.layers import Dense, Activation

X = np.random.randn(100, 100)
y = np.random.randint(2, size=(100, 1))

model = Sequential()
# apply regularization here; this applies regularization to the
# output (activation) of the layer
model.add(Dense(32, input_shape=(100,), activity_regularizer=fro_norm))
model.add(Dense(1))
model.add(Activation('softmax'))

model.compile(loss="binary_crossentropy", optimizer='sgd', metrics=['accuracy'])
model.fit(X, y, epochs=1, batch_size=32)
```

The following would NOT work, as @Marcin's comment suggests: a regularizer must return a `Tensor`, and `LA.norm()` returns a NumPy scalar instead (`np.eye(w.shape)` would also fail, since `np.eye` expects an integer):

```python
from numpy import linalg as LA

def orth_norm(w):
    m = K.dot(K.transpose(w), w) - np.eye(w.shape)
    return LA.norm(m, 'fro')  # NumPy scalar, not a Tensor
```

References: Keras regularizers, Frobenius norm.
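As a sanity check on the penalty itself, here is a NumPy-only sketch (independent of Keras; `orth_penalty` is a hypothetical helper name) showing that ‖WᵀW − I‖_F is near zero exactly when the columns of W are orthonormal, which is what the regularizer pushes the kernel toward:

```python
import numpy as np

def orth_penalty(w):
    # Frobenius norm of (W^T W - I): zero iff the columns of W are orthonormal
    g = w.T @ w - np.eye(w.shape[1])
    return np.sqrt(np.sum(g ** 2))

rng = np.random.default_rng(0)

# QR factorization yields Q with orthonormal columns -> penalty ~ 0
q, _ = np.linalg.qr(rng.standard_normal((100, 32)))
print(orth_penalty(q))

# A raw Gaussian matrix is far from orthonormal -> large penalty
w = rng.standard_normal((100, 32))
print(orth_penalty(w))
```

The same quantity is what `cust_reg` computes symbolically with the Keras backend; the NumPy version is only for verifying the math.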