I am trying to get a custom version of the EELSpecNet model working, which is based on the U-Net architecture. The original, larger version can be found here: text

The model is used to reconstruct an optical spectrum from multispectral sensor data, where the sensor channels have broad spectral responses rather than tight bandpasses.

The original model is too large for my data, so I am trying to shrink it to an input/output size of 96. When I run the model, it fails with the error 'Dimensions must be equal, but are 96 and 48'. I am new to this model and would really appreciate a hint on what to change.

Here is the code:

import tensorflow as tf

class EELSpecNetModel_CNN_96ST(tf.keras.Model):

    def __init__(self):
        super(EELSpecNetModel_CNN_96ST, self).__init__()

        kernel_size = 4

        # Encoder: six stride-2 convolutions along the spectral axis
        self.conv_96x1 = tf.keras.layers.Conv2D(1, (1, kernel_size), strides=(1, 2), activation='relu', padding='same', kernel_initializer='random_uniform')
        self.conv_48x2 = tf.keras.layers.Conv2D(2, (1, kernel_size), strides=(1, 2), activation='relu', padding='same', kernel_initializer='random_uniform')
        self.conv_24x4 = tf.keras.layers.Conv2D(4, (1, kernel_size), strides=(1, 2), activation='relu', padding='same', kernel_initializer='random_uniform')
        self.conv_12x8 = tf.keras.layers.Conv2D(8, (1, kernel_size), strides=(1, 2), activation='relu', padding='same', kernel_initializer='random_uniform')
        self.conv_6x16 = tf.keras.layers.Conv2D(16, (1, kernel_size), strides=(1, 2), activation='relu', padding='same', kernel_initializer='random_uniform')
        self.conv_3x32 = tf.keras.layers.Conv2D(32, (1, kernel_size), strides=(1, 2), activation='relu', padding='same', kernel_initializer='random_uniform')

        # =======================================================================

        # Decoder: five stride-2 transposed convolutions
        self.deconv_6x16  = tf.keras.layers.Conv2DTranspose(16, (1, kernel_size), strides=(1, 2), activation='relu', padding='same', kernel_initializer='random_uniform')
        self.deconv_12x8  = tf.keras.layers.Conv2DTranspose(8, (1, kernel_size), strides=(1, 2), activation='relu', padding='same', kernel_initializer='random_uniform')
        self.deconv_24x4  = tf.keras.layers.Conv2DTranspose(4, (1, kernel_size), strides=(1, 2), activation='relu', padding='same', kernel_initializer='random_uniform')
        self.deconv_48x2  = tf.keras.layers.Conv2DTranspose(2, (1, kernel_size), strides=(1, 2), activation='relu', padding='same', kernel_initializer='random_uniform')
        self.deconv_96x1  = tf.keras.layers.Conv2DTranspose(1, (1, kernel_size), strides=(1, 2), activation='relu', padding='same', kernel_initializer='random_uniform')
        
        self.concat = tf.keras.layers.concatenate
        self.relu = tf.keras.activations.relu

    def call(self, inputs):
        enc_96x1 = self.conv_96x1(inputs)
        enc_48x2 = self.conv_48x2(enc_96x1)
        enc_24x4 = self.conv_24x4(enc_48x2)
        enc_12x8 = self.conv_12x8(enc_24x4)
        enc_6x16 = self.conv_6x16(enc_12x8)
        enc_3x32 = self.conv_3x32(enc_6x16)
        # =======================================================================
        dcd_6x16 = self.deconv_6x16(enc_3x32)
        dcd_6x16x2 = self.concat([dcd_6x16, enc_6x16], axis=-1)
        dcd_12x8 = self.deconv_12x8(dcd_6x16x2)
        dcd_12x8x2 = self.concat([dcd_12x8, enc_12x8], axis=-1)
        dcd_24x4 = self.deconv_24x4(dcd_12x8x2)
        dcd_24x4x2 = self.concat([dcd_24x4, enc_24x4], axis=-1)
        dcd_48x2 = self.deconv_48x2(dcd_24x4x2)
        dcd_48x2x2 = self.concat([dcd_48x2, enc_48x2], axis=-1)
        dcd_96x1 = self.deconv_96x1(dcd_48x2x2)

        return dcd_96x1
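To see where the sizes diverge, I traced the spectral-axis width by hand (as far as I understand, with 'same' padding a stride-2 Conv2D outputs ceil(w / 2) and a stride-2 Conv2DTranspose outputs 2 * w):

```python
import math

# Hand trace of the spectral-axis width through the model above,
# assuming 'same' padding: stride-2 Conv2D -> ceil(w / 2),
# stride-2 Conv2DTranspose -> 2 * w.
w = 96
encoder_widths = []
for _ in range(6):  # six Conv2D layers in the encoder
    w = math.ceil(w / 2)
    encoder_widths.append(w)
print(encoder_widths)  # [48, 24, 12, 6, 3, 2]

decoder_widths = []
for _ in range(5):  # but only five Conv2DTranspose layers in the decoder
    w = 2 * w
    decoder_widths.append(w)
print(decoder_widths)  # [4, 8, 16, 32, 64]
```

So the encoder halves six times while the decoder only doubles five times, and 96 = 3 * 2^5 can only be halved cleanly five times anyway. That seems consistent with the 96-vs-48 mismatch, but I am not sure which layers to change.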

I tried to get help from ChatGPT, but it does not really get the complexity right. I also tried other versions with input sizes of 16 and 64, but it appears I have some basic misunderstanding of the architecture.
