
I am working on optimizing a table that currently stores BLOB data in a VARBINARY(MAX) column.

The current table structure looks like this:

CREATE TABLE bigData
(
    Id INT NOT NULL PRIMARY KEY,
    data VARBINARY(MAX) NOT NULL,
    dataName NVARCHAR(MAX) NOT NULL
);

I want to migrate the data from the data column to a new column that uses the FILESTREAM feature to reduce the database load. I've already added the necessary columns to the table:

ALTER TABLE bigData
    ADD Guid UNIQUEIDENTIFIER ROWGUIDCOL NOT NULL UNIQUE DEFAULT NEWSEQUENTIALID();

ALTER TABLE bigData
    ADD dataFileStream VARBINARY(MAX) FILESTREAM NULL;

The table contains over 150,000 rows (~180 GB of data), so a single direct UPDATE is impractical because of the load it would place on the database. Creating a new table with a FILESTREAM column and transferring all the data into it is also off the table.
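For concreteness, the UPDATE-based migration I am trying to avoid would look roughly like this (a sketch; the batch size of 100 rows is an assumption, not something I have benchmarked):

```sql
-- Sketch: copy the BLOBs into the FILESTREAM column in small batches
-- so the transaction log and I/O load stay bounded per statement.
DECLARE @BatchSize INT = 100;

WHILE EXISTS (SELECT 1 FROM bigData WHERE dataFileStream IS NULL)
BEGIN
    ;WITH batch AS
    (
        SELECT TOP (@BatchSize) Id
        FROM bigData
        WHERE dataFileStream IS NULL
        ORDER BY Id
    )
    UPDATE b
    SET b.dataFileStream = b.data
    FROM bigData AS b
    INNER JOIN batch ON batch.Id = b.Id;
END;
```

Even batched like this, every byte still flows through the buffer pool and transaction log once more, which is exactly the load I want to avoid.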

My objectives

Migrate the data from the data column to dataFileStream with minimal performance impact.

Delete the data column after a successful migration.
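For the cleanup step, I expect something along these lines (a sketch; renaming the new column back to data via sp_rename is my assumption, not a requirement):

```sql
-- After verifying every row has been copied, drop the old column
-- and let the FILESTREAM column take over its name.
ALTER TABLE bigData DROP COLUMN data;
EXEC sp_rename 'bigData.dataFileStream', 'data', 'COLUMN';
```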

What I've considered so far

Using BCP to export and re-import the data. BCP's queryout mode can export the result of an arbitrary query, but re-importing each BLOB into the matching row of a single column (rather than loading whole rows into a table or view) complicates the process considerably.

My question

Is there a better method to export BLOB data to files and re-import it with minimal resource usage?
