## Spectral Norm #

### Power Iteration #

Consider the iteration

$$u \leftarrow \frac{Au}{\Vert Au\Vert},\qquad A = W^{\top}W,$$

initialized with some $u_0$. As before, assume $A$ is diagonalizable, and that among $A$'s eigenvalues the largest is strictly greater than the rest. (If this condition fails, the largest eigenvalue is a repeated root, which is considerably messier to analyze; readers interested in that case should consult a rigorous proof, as the discussion here is only a starting point. From a numerical standpoint, though, virtually no two floating-point numbers are exactly equal, so we can assume the repeated-root case does not arise in experiments.) The eigenvectors of $A$ then form a complete basis, so we can expand $u_0$ in that basis; each multiplication by $A$ amplifies the component along the dominant eigenvector the most, so the normalized iterate converges to that eigenvector, and the quotient $u^{\top}Au$ converges to the largest eigenvalue of $A$, i.e. $\Vert W\Vert_2^2$.
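As an illustration, here is a minimal NumPy sketch of this procedure (function and variable names are my own; the Keras version used in practice appears below). It recovers the spectral norm of a random matrix:

```python
import numpy as np

def power_iteration_spectral_norm(W, iters=500):
    # Iterate u <- A u / ||A u|| with A = W^T W; u converges to the
    # eigenvector of A with the largest eigenvalue, which equals ||W||_2^2.
    A = W.T @ W
    u = np.ones(W.shape[1])
    for _ in range(iters):
        u = A @ u
        u /= np.linalg.norm(u)
    # Rayleigh quotient of the normalized u: the largest eigenvalue of A,
    # i.e. the squared spectral norm of W
    return np.sqrt(u @ A @ u)

rng = np.random.default_rng(0)
W = rng.normal(size=(8, 5))
sigma = power_iteration_spectral_norm(W)  # compare with np.linalg.norm(W, 2)
```

Here `iters` is set far higher than needed in practice, purely so the estimate matches the exact SVD value tightly.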

### Spectral Norm Regularization #

The paper "Spectral Norm Regularization for Improving the Generalizability of Deep Learning" reports experiments showing that spectral norm regularization improves model performance across multiple tasks.

    import numpy as np
    from keras import backend as K

    def spectral_norm(w, r=5):
        # Flatten w into a 2D matrix of shape (in_dim, out_dim)
        w_shape = K.int_shape(w)
        in_dim = np.prod(w_shape[:-1]).astype(int)
        out_dim = w_shape[-1]
        w = K.reshape(w, (in_dim, out_dim))
        # Alternating power iteration: u and v converge to the top
        # left/right singular directions of w
        u = K.ones((1, in_dim))
        for _ in range(r):
            v = K.l2_normalize(K.dot(u, w))
            u = K.l2_normalize(K.dot(v, K.transpose(w)))
        # u w v^T approximates the largest singular value (the spectral norm)
        return K.sum(K.dot(K.dot(u, w), K.transpose(v)))
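As a sanity check, the same alternating u/v iteration can be written in plain NumPy (my own illustrative re-implementation; it uses far more iterations than the `r=5` above so that the result can be compared numerically against the exact SVD value):

```python
import numpy as np

def spectral_norm_np(w, r=1000):
    # Flatten all but the last axis, as the Keras version does
    w = w.reshape(-1, w.shape[-1])     # (in_dim, out_dim)
    u = np.ones((1, w.shape[0]))
    v = np.ones((1, w.shape[1]))
    for _ in range(r):
        v = u @ w                      # (1, out_dim)
        v /= np.linalg.norm(v)
        u = v @ w.T                    # (1, in_dim)
        u /= np.linalg.norm(u)
    # u w v^T converges to the largest singular value of w
    return (u @ w @ v.T).item()

w = np.random.default_rng(0).normal(size=(5, 4))
sigma = spectral_norm_np(w)  # compare with np.linalg.norm(w, 2)
```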

## Generative Models #

### Keras Implementation #

    def spectral_normalization(w):
        # Divide the kernel by its spectral norm, making the map 1-Lipschitz
        return w / spectral_norm(w)

https://github.com/bojone/gan/blob/master/keras/wgan_sn_celeba.py
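A quick NumPy check of the idea (variable names here are illustrative): dividing a kernel by its spectral norm yields a matrix whose largest singular value is exactly 1, i.e. a 1-Lipschitz linear map.

```python
import numpy as np

rng = np.random.default_rng(1)
W = rng.normal(size=(6, 4))
# np.linalg.norm(., 2) of a matrix is its largest singular value
W_sn = W / np.linalg.norm(W, 2)   # spectral normalization
# The spectral norm of W_sn is now 1 (up to floating-point rounding)
```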

For reference, the stock `add_weight` method of Keras's `Layer` class is:

    def add_weight(self,
                   name,
                   shape,
                   dtype=None,
                   initializer=None,
                   regularizer=None,
                   trainable=True,
                   constraint=None):
        """Adds a weight variable to the layer.

        # Arguments
            name: String, the name for the weight variable.
            shape: The shape tuple of the weight.
            dtype: The dtype of the weight.
            initializer: An Initializer instance (callable).
            regularizer: An optional Regularizer instance.
            trainable: A boolean, whether the weight should
                be trained via backprop or not (assuming
                that the layer itself is also trainable).
            constraint: An optional Constraint instance.

        # Returns
            The created weight variable.
        """
        initializer = initializers.get(initializer)
        if dtype is None:
            dtype = K.floatx()
        weight = K.variable(initializer(shape),
                            dtype=dtype,
                            name=name,
                            constraint=constraint)
        if regularizer is not None:
            with K.name_scope('weight_regularizer'):
                self.add_loss(regularizer(weight))
        if trainable:
            self._trainable_weights.append(weight)
        else:
            self._non_trainable_weights.append(weight)
        return weight

The modified version handles the constraint differently:

    def add_weight(self,
                   name,
                   shape,
                   dtype=None,
                   initializer=None,
                   regularizer=None,
                   trainable=True,
                   constraint=None):
        """Adds a weight variable to the layer.

        # Arguments
            name: String, the name for the weight variable.
            shape: The shape tuple of the weight.
            dtype: The dtype of the weight.
            initializer: An Initializer instance (callable).
            regularizer: An optional Regularizer instance.
            trainable: A boolean, whether the weight should
                be trained via backprop or not (assuming
                that the layer itself is also trainable).
            constraint: An optional Constraint instance.

        # Returns
            The created weight variable.
        """
        initializer = initializers.get(initializer)
        if dtype is None:
            dtype = K.floatx()
        weight = K.variable(initializer(shape),
                            dtype=dtype,
                            name=name,
                            constraint=None)
        if regularizer is not None:
            with K.name_scope('weight_regularizer'):
                self.add_loss(regularizer(weight))
        if trainable:
            self._trainable_weights.append(weight)
        else:
            self._non_trainable_weights.append(weight)
        if constraint is not None:
            return constraint(weight)
        return weight

(The change: `K.variable` is now created with `constraint=None`, and the constraint is instead applied at the very end, so the method returns `constraint(weight)` rather than the raw variable.)
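The effect of that reordering can be sketched in plain Python with hypothetical stand-ins (this is not Keras code; `add_weight_patched` below is an illustrative mock): the raw variable is created without a constraint, and every consumer of the weight then sees `constraint(weight)` instead.

```python
import numpy as np

def spectral_normalization(w):
    # Same idea as the Keras snippet above, in NumPy: w / ||w||_2
    return w / np.linalg.norm(w.reshape(-1, w.shape[-1]), 2)

def add_weight_patched(shape, constraint=None, seed=0):
    # Stand-in for K.variable(...): create the raw weight WITHOUT a constraint
    weight = np.random.default_rng(seed).normal(size=shape)
    # ...then apply the constraint as the final step, so downstream code
    # receives the constrained tensor rather than the raw variable
    if constraint is not None:
        return constraint(weight)
    return weight

kernel = add_weight_patched((3, 4), constraint=spectral_normalization)
# kernel now has spectral norm 1
```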