SQL Server Updating Inserting And Deleting From Large Datasets

Using SQL Server Updating Inserting And Deleting From Large Datasets
When working with particularly large datasets there can be issues with locking, but it doesn't have to be that way. What we can do is break the work out into smaller transactions.
Let's look at the code below. First of all, open SSMS with three query windows, and open the Activity Monitor with its refresh interval set to one second.
In the first window, we can create a table; the listings here use dbo.GuidTest as a stand-in name, so swap in whatever name suits you.

CREATE TABLE dbo.GuidTest (
    ID BIGINT IDENTITY(1,1) CONSTRAINT PK_ID PRIMARY KEY,
    GUID UNIQUEIDENTIFIER
)

CREATE INDEX IDX_GUID ON dbo.GuidTest (GUID)

Execute this code and then replace it with the batch below, which inserts five million random GUIDs into the table one at a time; this simulates a constant stream of inserts for us. You can push this further if you like. It also prints the running total every 1,000 rows, so you can keep track of progress.

SET NOCOUNT ON

DECLARE @ID BIGINT = 1

WHILE @ID <= 5000000
BEGIN
    INSERT INTO dbo.GuidTest (GUID) SELECT NEWID()

    IF @ID % 1000 = 0 PRINT @ID

    SET @ID = @ID + 1
END

In the second window, add the code below, which selects 10,000 records whose GUID begins with 20.

SELECT TOP 10000 * FROM dbo.GuidTest WHERE GUID LIKE '20%'

Now, in the third window, you can run this, which loops through the GUIDs and deletes, 1,000 rows at a time, every one that begins with a 0.

WHILE EXISTS (SELECT TOP 1 * FROM dbo.GuidTest WHERE GUID LIKE '0%')
BEGIN
    DELETE TOP (1000) FROM dbo.GuidTest WHERE GUID LIKE '0%'
END
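
The same small-batch idea applies to updates as well. As a rough sketch (not part of the walkthrough above, and assuming the dbo.GuidTest name and IDENTITY ID column used here), you could refresh rows 1,000 at a time rather than in one big statement:

-- Hedged example: update in batches of 1,000 rows rather than all at once,
-- assuming the dbo.GuidTest table and its IDENTITY ID column from above.
DECLARE @BatchStart BIGINT = 0

WHILE @BatchStart < 5000000
BEGIN
    UPDATE dbo.GuidTest
    SET GUID = NEWID()
    WHERE ID > @BatchStart AND ID <= @BatchStart + 1000  -- each statement touches at most 1,000 rows

    SET @BatchStart = @BatchStart + 1000
END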

Once you get above 500,000 records, run either or both of the statements in windows two and three, and watch the Activity Monitor to see what happens. You should see very few locks, and while the system is still taking locks, it will not stop other users from carrying on with the work they need to do, although things may slow down to a certain extent.
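
If you would rather query the locks than watch the Activity Monitor, a quick look at the sys.dm_tran_locks system view (a standard DMV, not something specific to this example) shows what is currently held or waiting in the database:

SELECT request_session_id, resource_type, request_mode, request_status
FROM sys.dm_tran_locks
WHERE resource_database_id = DB_ID()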
Combine the code above with the automatic re-indexing, and it should keep your system in perfect working order.
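
The re-indexing itself isn't shown here; as a minimal sketch (assuming the IDX_GUID index and dbo.GuidTest name used above, and the commonly quoted 5%/30% fragmentation thresholds), it could look something like this:

-- Hedged example: reorganise or rebuild the index depending on fragmentation,
-- assuming the IDX_GUID index on dbo.GuidTest created earlier.
DECLARE @Frag FLOAT

SELECT @Frag = avg_fragmentation_in_percent
FROM sys.dm_db_index_physical_stats(DB_ID(), OBJECT_ID('dbo.GuidTest'), NULL, NULL, 'LIMITED')
WHERE index_id = INDEXPROPERTY(OBJECT_ID('dbo.GuidTest'), 'IDX_GUID', 'IndexID')

IF @Frag >= 30
    ALTER INDEX IDX_GUID ON dbo.GuidTest REBUILD
ELSE IF @Frag >= 5
    ALTER INDEX IDX_GUID ON dbo.GuidTest REORGANIZE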