How to handle residual connections with the functional API in Python?


Keras ships with the TensorFlow package and can be accessed with the following lines of code.

import tensorflow
from tensorflow import keras
from tensorflow.keras import layers

The Keras functional API helps create models that are more flexible than those built with the Sequential API. The functional API can handle models with non-linear topology, shared layers, and multiple inputs and outputs. A deep learning model is usually a directed acyclic graph (DAG) of layers, and the functional API helps build this graph of layers.
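As a minimal sketch of what the Sequential API cannot express, the snippet below builds a tiny functional model with two inputs and one Dense layer shared between them; the names (input_a, input_b, shared_dense) are purely illustrative and not part of the ResNet example that follows.

from tensorflow import keras
from tensorflow.keras import layers

# Two separate inputs feeding the same (shared) Dense layer
input_a = keras.Input(shape=(16,), name="a")
input_b = keras.Input(shape=(16,), name="b")

shared_dense = layers.Dense(8, activation="relu")   # one layer object, reused twice
merged = layers.concatenate([shared_dense(input_a), shared_dense(input_b)])
output = layers.Dense(1, activation="sigmoid")(merged)

# A model with multiple inputs and a non-linear graph of layers
tiny_model = keras.Model(inputs=[input_a, input_b], outputs=output)
tiny_model.summary()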

We are using Google Colaboratory to run the code below. Google Colab (Colaboratory) makes it possible to run Python code in the browser with zero configuration and free access to GPUs (Graphics Processing Units). Colaboratory is built on top of Jupyter Notebook. Below is the code snippet:

Example

print("Toy ResNet model for CIFAR10")
print("Layers generated for model")
inputs = keras.Input(shape=(32, 32, 3), name="img")
x = layers.Conv2D(32, 3, activation="relu")(inputs)
x = layers.Conv2D(64, 3, activation="relu")(x)
block_1_output = layers.MaxPooling2D(3)(x)

x = layers.Conv2D(64, 3, activation="relu", padding="same")(block_1_output)
x = layers.Conv2D(64, 3, activation="relu", padding="same")(x)
block_2_output = layers.add([x, block_1_output])

x = layers.Conv2D(64, 3, activation="relu", padding="same")(block_2_output)
x = layers.Conv2D(64, 3, activation="relu", padding="same")(x)
block_3_output = layers.add([x, block_2_output])

x = layers.Conv2D(64, 3, activation="relu")(block_3_output)
x = layers.GlobalAveragePooling2D()(x)
x = layers.Dense(256, activation="relu")(x)
x = layers.Dropout(0.5)(x)
outputs = layers.Dense(10)(x)

model = keras.Model(inputs, outputs, name="toy_resnet")
print("More information about the model")
model.summary()

Code source − https://tensorflowcn.cn/guide/keras/functional
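As an optional aside, the graph of layers can also be rendered as an image with keras.utils.plot_model; this assumes the model object built above and requires the pydot and graphviz packages, which a Colab runtime typically provides.

# Optional: draw the DAG of layers to a PNG file
keras.utils.plot_model(model, "toy_resnet.png", show_shapes=True)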

Output

Toy ResNet model for CIFAR10
Layers generated for model
More information about the model
Model: "toy_resnet"
________________________________________________________________________________
__________________
Layer (type)          Output Shape          Param #       Connected to
================================================================================
==================
img (InputLayer)       [(None, 32, 32, 3)]    0
________________________________________________________________________________
__________________
conv2d_32 (Conv2D)    (None, 30, 30, 32)     896          img[0][0]
________________________________________________________________________________
__________________
conv2d_33 (Conv2D)    (None, 28, 28, 64)    18496         conv2d_32[0][0]
________________________________________________________________________________
__________________
max_pooling2d_8 (MaxPooling2D) (None, 9, 9, 64) 0          conv2d_33[0][0]
________________________________________________________________________________
__________________
conv2d_34 (Conv2D)       (None, 9, 9, 64)       36928       max_pooling2d_8[0][0]
________________________________________________________________________________
__________________
conv2d_35 (Conv2D)       (None, 9, 9, 64)       36928       conv2d_34[0][0]
________________________________________________________________________________
__________________
add_12 (Add)             (None, 9, 9, 64)          0       conv2d_35[0][0]
                                          max_pooling2d_8[0][0]
________________________________________________________________________________
__________________
conv2d_36 (Conv2D)          (None, 9, 9, 64)    36928       add_12[0][0]
________________________________________________________________________________
__________________
conv2d_37 (Conv2D)          (None, 9, 9, 64)    36928       conv2d_36[0][0]
________________________________________________________________________________
__________________
add_13 (Add)                (None, 9, 9, 64)       0       conv2d_37[0][0]
                                       add_12[0][0]
________________________________________________________________________________
__________________
conv2d_38 (Conv2D)          (None, 7, 7, 64)    36928       add_13[0][0]
________________________________________________________________________________
__________________
global_average_pooling2d_1    (Glo (None, 64)      0       conv2d_38[0][0]
________________________________________________________________________________
__________________
dense_40 (Dense)             (None, 256)          16640    global_average_pooling2d_1[0][0]
________________________________________________________________________________
__________________
dropout_2 (Dropout)          (None, 256)          0          dense_40[0][0]
________________________________________________________________________________
__________________
dense_41 (Dense)             (None, 10)          2570       dropout_2[0][0]
================================================================================
==================
Total params: 223,242
Trainable params: 223,242
Non-trainable params: 0
________________________________________________________________________________
__________________

Explanation

  • The functional API can handle models with multiple inputs and outputs.

  • The functional API simplifies working with non-linear connection topologies.

  • The layers of this model are not connected purely sequentially, so the Sequential API cannot handle it.

  • This is where residual connections come into play.

  • A toy ResNet model on CIFAR10 is built to demonstrate this; a training sketch follows this list.
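For completeness, here is a minimal training sketch on CIFAR10 using the model built above, close to the one in the linked guide; the optimizer, loss, batch size, and number of epochs below are illustrative assumptions rather than prescribed values.

# Load CIFAR10 and scale pixel values to [0, 1]
(x_train, y_train), (x_test, y_test) = keras.datasets.cifar10.load_data()
x_train = x_train.astype("float32") / 255.0
x_test = x_test.astype("float32") / 255.0

# One-hot encode the 10 class labels
y_train = keras.utils.to_categorical(y_train, 10)
y_test = keras.utils.to_categorical(y_test, 10)

# The final Dense(10) layer outputs logits, hence from_logits=True
model.compile(
    optimizer=keras.optimizers.RMSprop(1e-3),
    loss=keras.losses.CategoricalCrossentropy(from_logits=True),
    metrics=["acc"],
)
model.fit(x_train, y_train, batch_size=64, epochs=1, validation_split=0.2)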

Updated on: 18-Jan-2021
